[tar archive listing: var/home/core/zuul-output/ (directory), var/home/core/zuul-output/logs/ (directory), var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log) — the remainder of the file is the binary gzip payload of kubelet.log and is not recoverable as text]
:ټXKU]U=v_@JB(Q$|u| >tzh8.a:xx)Ʈ| 񧣾$vҷvUzNxPtG ^,yx&)ph<_x0w돨2_`$岛o,`dFBg," DiL&'Z^ٶn{)>EUC6TZ(D]M&gVhXkjnjiƩ[K 3,Se`.^cT,Z宋\i *ѮFGIubTl:犫TR]uC|߲bfVФw_'ofm㣓vugZFo420wr*G"ӤVc89>ȏO]jP>g<h+!br]:U^VGfr4\׀:L\`޲s\,4ղk[1rKb6Kkk1u4_n-my`aT{HN( A,Pq~w-D" S֋\B':G)Z|nA>TmŢYGcŷA:ԮcmИ2|㗇ށ0Uv2+_.*B)#ŰVXf8-U^A2QTY;2 5I Q,1< ?;dȭ B0䃆G51Y}ƑkJC$3{;m)yHlMgkM?M=Geapgr[WUu.->יcjk5PgI[wB<Pi >*2Z e|t %}ecs["κWT|7˲n^MVOńfR b(̵@@ 4yL 5_7z~`R |65Y'IeϔDd@AMχ #kue|h2u;x{Xck=7* `h_"'XK E[}И͔UJ[ӊ|D,m"NlŶ]88AwcZϹ T:810@Y@>o>y=|!;?f}%SY?>REEbtQ{ϤMh8hx`OWsicmSFI˫˖X;Mݲbq6Sqv=Ph(FֻO yi#fW(uGI.tY靨 ǁ9\*[~! &WEp­/ ݬSK@Ī8fha⇸/xx\?mf7{րQv|Qpdn=q} .~WI"f:> */oPG sWS.lpq;XۘU16Sh$6cm/J_, WHQGfuMV4%kw2ؠ ZT/dAy ɝL%D\`\rnAp`f$yo^lMPxG d H]5|_7t 0m3pEB!J=ZBLrqqXheki&j&0SUf\w k-l(* T-w(]MXyE<湚ڎ *LCBь1|e Yt h&}f8-99mUs4vWM_AdPDv 8ﮧ71Μ`"&e윳 (O_;Xޕ6Bem)} 0X`m0Snd)a:"U$KR"4=dV_DQ?V(P)Ŕ|q76"寞r 8^ԃW=2ZG! qQ)FO5J9#R$w$HhSνcqJ,e! "m#;b}ss޷S{J_.GMJ.<>%0.G_'`?&jo~|L?O۟Xkwy$A@rj,i<.UվmAm#@qH1Bkkv>Q}bH%^`ր#EWe,4:ګ=]'qKеΡSEOx^?eFcCa 5֠jNmv7fjƖ lr)v>yxuE=Zb:_vw՜|lkYw}#f0Ytf31\@J,+[dI?q )ؙL vQ2 v\ccxǵ[memyreYX hC (('!DJ'Qe{kurR9eɁ2+Pf1_͊\NtXswκ:6-{15OŧӋ `ъXRiO^1VxdX3ˉL,TP9sBOvŔzwM8plө3 H0ǃLn2*+J@x<^ap:@  h3ţGXe5E=K2mu 4vA2[J14A`$!Șa*M'[e~r iqCxz %%" 2JrC?pBoHZog)c҆knfll#ʩQ1}j;u޵j-X3ȯbt hjҮvv]ߤ[3eBVg$"1Ru%DQI(">+ Oѻi*HWh"Q7ԫz*WEG_? ~I Fpuq-#<Ǫ]s/bѹC5^o |撚>.Oyhd)ioҷ1: ( r<) K(9<8<~,rsGއ Vm+SX|"W +5-uCٷӻStJC(H=)9qAaE*"bH;F(%əS/l;I'F!`a*UtVAZhHq oF*GXV-Z6yL]T1 TX9$LK<21,^E`œVQM3Q%jqTg~hmq;k0Yk+_ rw>KMvà PݠCġvDz(̘ IQ1@3 Q3D݇n&vz;K+h䯋/2/+QG&pGe7HQ|^64c 9=xNFC^pM^ ,4E! E8QRp) `"Z1Å|01hKN>o"_cOP*n,HP[d=5EEqUQ!Cc,+=o&QMoA}uS$ElZ7)yC[?AEp7u\\0'\J‘M&2idT 8D#Ĩare]-얀eM׹]jx<<9De$8„TÐ Nrpa95:!S(UTyx>@ۚ3d@>e_ƅF2D#")5[Ed&ciy_(88Km)fpk#Mv0L-d  zeVFuЮtQ@ QH?Z[Χ |c@B.CA,⿼b.cI0a" 1s4G! 
L%%nr.UV홓Wom6XeȦ):rA%+~l\{O]ΡSfEO\^?eFcCNW܉Al]?Utle|cvfal b&jAϫ;=Zb:_vw՜|kYw:Qa6f^47Mwwު}`1?|)9"J=0 ;qER7& ]/"%B3OiJK (,R)yAzٸn=g&4߃Z|~޴>X&9B)GXW쨅SBSŠ aP.D>3)~ۨ%䇘ȯZ=K|ˤ$\[V1\Z{ kkn8ŗX 9p"vb9R@ GT^KbHbU/"P'S ZFd2.INli0ނp>s xf8h08jr lu֏ÿ z'0^.C 5Kwj݅JmfyG=4e ,V>\UPՓo)|o>VB8ofq/pCZ-Bz+ΎΫ[oXj"mhGT^Q_=Vݳjt>-W;\VrvkCOo.]uȷng/ݽY;w?03ܞKI&V HOGME;k&G&C q3Aͳy}Fc O@+:,~qք<}I/xƸ|l12wDc`\`#N^8Ơx核c/_OLwtU#`*RA)v< ndl[07lγYXt;OwDܴ}&Ft:!{f ^UؤzAן|#:/BXɼd^8R<F"nXYds1Ǚ[R+Y5zlվtCqe`Ur\qx4qRanOWx!pµr+8~=,[wo!wΧ{eߦ~ZֱyaZJmjEM~v|14U^Mgٙ1dSP%ل0PknxPPM;h|rfpWKr<FP`h/#LSsGdĪ eR9r>{`L^߹!MQ9Yq?O{S.a;AK %B@w8 ɗf/VU1ZI,{\^b̉4g$ZUv\z@\ȏK+UdS *80!JL>R؀:TXt4qCh@[ՊhP֙fcGz:Ns[[OEFx7տy$MW K"%E,:IDO,!FXʘuw򤏄w\(]YI",=$/r$vrk5>x]%5AMP2^-2HC:'hPJUoVl+ZbK WtP<\!B{h4!gQ\P<3FAhzZ+;-ge:P4c ]uXUrf>jWҚ9 4I[* ZHd-HR[Hk(HG I iobjwNuC`T6oXwP`>*\B Ʉ t3O*e{}$/ ^P2͞ UρC7DGB ڄ'Xx<16 *3ޅpw!xUmzObO0K+(t!H;PAp \QC U6KT"9di3T'Bc.Iͻ>jI OH#ڍC[ŊP)y21GB JRFJz\ReL ũ Lu}$T/u^IW3p@O'`@K4kUؑH;]{Vo$45p_ -u!xHkqp^<νB B+m#xLi,U2(hBpE bVk̈́QQPevc$ ) ]elKv9xH ZFzПZL5z ]{HwkO@VА(I d@pRS{yw{|9 IY.]_BXSo*t&JhhTJk1=$llX^>}‰g$g\~FA)l"8q8E;s\w kʝ dd!&.K@B!+= |&@f_#9[Dxͣgw!x%GeP-.K3oÆ*K}$oTy/V Y.O1GB B̀KR) V&*K) ct^v!xiMT)㶀tP7ądڙHޘ`x"JN 4Q]jw+ :HD :s ]L #xZ;tsP =XW&);lP`BTPS&<Ic t-Pc>DEd/8F`H !ZK*G 5+%kWx!qὊZjlPBd 4BֱYP؊Bm+ВlHW_t*l` }f{/mwm:0B-x8k+ô0C@a)` ՃBm]j;+-%7CjAa3w›jHWF̀ vá+PwCJWTNW_%]9I[[Z3Yp9i~%ϫ9n/X8J_aף7Χ}?ϮO'Q9xQ3,HPQ`%mo5_k?r>dmbUxr4}ҧ2?|Z6[UWlگ[Vwy2GpWeu /E<{eGN\2`2'RDQDg&gǥWaA?,b&Jiqd|KSz>_jŽ_}&fEۉq!FKT"Ґ;c@Ӽ"_e>_HJZeiv19m?V?#xdp'wݤv@NK(r ngP5aց-YM,8{oVZ? 
/29\RMS Q0GYsL J0F^kM-s,Ӽg"UՌ[ &~y79[K2PlgI/p翁?&rx~]8pѦMĭ*_7tZgK*@$W.)d]wg6ڋE _[ܙ/?6cqox^ 吱bqRNkgGҿL kH>EeP>"jgj{I_!Gm, }@._voe"2C;Ғ 9.v2yꭶYӞl}x7>$ĸM!1QG ILf9XPUhbXqz| Id[eU8kx%Zc}pCmГc# %dZSb^jZI_;aa %Ui)tѪQ+DՍ[㩴[޷!{JjbES+:ޙuLLS $Cvfoݬlc,*-aRx2Za6ir7]upRee4]ʤ65Onb[TE·&ouqVZl$僬zIWr~{^{nеӫ'e&WH'󪑸7_$8f=˟eRW_N1$h6iwpA-k.P2ii8[mvӥmUשG'kHK(Fӿ 0 ICl4/D:lLCU=U[>]Q~b CW-<]eBtutem3/ӨsXˤ&FG}kWۦm #Ɓ[2^T4%+ w;ME8m3[4e8AdͶ=RY hJZLFjUJFÔ!:Jƌ 3a `{cc,d fdht*t%T$ZSz12*J+rtŠΐ6*ZLtr=t(t +n2\S ]"9ҕ6Rj3{ ,'h%:]eʎtutedxUu)t2JG:CVPZR2r WR*5+t]eO vl詓np?-]uCOTw %k;PSYeh[0#t|g6 -z`86` } תR'ja?3JGy}+֔CWUFk ( 1#]!] DDW|#|tR*btQ1:GBbo/$/p9)2Z1*Td3+E@N%/g*Õh-:]l3+M ʀ-2\SLtGWu;ҕaT*kr p.f0F^zcxJʚs+ˤԢ LQk+Z+ʑ ])Nnd'֜_dEH'\(R2t%F:TKiAtb p(2>t(րmס|`/ Q?2}Q1]̗K9o 5Zot2^+ϛb2=Zl)v7Sƍ岠qKN\fJh2SR19Ì&&K+VTCWkYBS %r3+!  0U{]"nCꑮΐƔ/JCW,2Z1Rtut/ +b p+&h:]ZÑ^ /1Õ,gvtHWF*6 f]-(3xte!ACWn9;RCҌodKK6o ѧN'~OZN3 X9فHW\P]Ѽ5@ R*5btE$Ck4Е |!zb|m 7H VbgY)iC(OnQ%F]ep?5vt #Dtut%$\DWL /2\eK+e2JFG:CF ) + W]ztQqJY*eI{?x9U=2Z3Ҏ{?HW"V)&6\Eh9:]er\j?G2/`7l7Ak ί>Ak(ޯ~"ԅMX7?)~}jܗ;R޵8o2֐3_g7iJ Z]خf{†<@=׼,/ͯanO5+޹vt͝6oD-Z6itXYW;bOw)Tޞ'aTo}Hq;`m׭5vV}kt$]+;#{#WRhLo:w")lv3֬W|53oLۃNOrV|ZrW<` >T.D\7=Y 6׃m`ˬzm%=J||ڎ[%vBçk^j`Pl1)>#^wQ,{w=u.h%9Ukn(٘EęRb[(tOtVva׍DBÇQlt C{1EʖUYۯjήk0]R+_ypKҗAOPj_קۏ]MX6jŭRBzhK8DH}W?ݺ0[~tɆ3ˍ1nQ?y6|576Y7zzzm.^[#f݆'L=LASWy5alO45SSr@2E̋\A`{IJcp@Y >xlωE6=e>=@O=mrd^Y'9M_4:&{ocQp3iDI[8zhPRQ2F'/r w>]%FϗH>n*/~}+bK+Hz$gQW^RG$TZFXC]=D- j'zM!BH Lj}d!rVq\4ʅCfi>s >4W!FSE" pfZXtUS) D+DSr)B!jS1@"˧ ÈVWW;˝^iNrՐTl 3t% D*JIɤK'*\"sa.BTjS֌AYȆ*zDt IV;GZaESuzZ,g,CQà#mJs+ ÚXu03͐@Y%Uetl cV2ljUB>ЈĊI-7-o\HL6.P$HY[+`IT{aJk'Z]Rʶ͝hUj|%.Ti))x$hIŭ!R Rhy}RܗZ'Ke:q؎.qL280 qZi+K`5 jF"񥀅ؐwsTLLZMQ#D,UAvP0Rvqh* `:^{`ED,A*YJhB#DW:&xim0IPF-"ZY BY #BUdlBDGb ֊ <59 iwa$pݫD D[ʫE !ri`ژ[V ,&b9hfCS[):jyI%2O&0a"`pJڬ7̡(X@0H+xRD%N  aXhl9P O0W `,$D 0 X .d  d>2">D4 Q"&򈐯2ȱdƙ3 rYF<وHab PFߌP .R"c)HN+ H i`9  D-JPUJ !#zycB@uKyh!-~3 beJ00dqLDcOлJK )](%BiH2Ӣe x][o8+F=*/"Y@ch`闝i xM=S`J&$ŕX .P<+7V(Ѱy Dm dՕ\r;&TQE|tX:5(@^hૈ;YFT6e 8I ~c0vnb6/vj/ĝFRi+C)VZ ʘ0;JYgd3T\^ s@ QJM 
.  ,rK'k)2#=zڎ{[;az+PH{Up@}Ɗ ڴbFiQ$F̥jകJNHp|S$'e#ecXT4I+җyX@?xZ@"75% p1ꈱ`L ôvU*m UgND k7 /֔KKcOAy0/dA] b}3. Nvo&232 C>N @]*tqtI2n ٌےhFaǦh ڶwej=Z6Kނ %+Y `ibѐfd s8nJxKTheTs!\nA7 "4Kh JT9oDsJԃ(p@HP0oDc'fH9Հc%X߰U/aq%BOuԕƄT bK[KA'ya-.`,"a`$. Ɉ"!U4[JxЦtq4e܁csr%#6#r])Z~)z@&VeHH$ 0i|tpUy]]?!]sgU߳5P燝)$ SM) D01 v;^}p$k 7Pf lG @-m1E zȕBeS"q:d%qL5(d5@=FP\‑ wx@$\0ֲgms*D;]M6_yTGW*Y47 !p۟@EE$Tiw2foẵoW8<N]+dgj]%?`j?LzпLlfn)`=6V\@ 2r8n}9;[cYUWQhcaE( XlW%h)"^Dh_c^?Vݰ\\Ƕ&.槙[v_uy60_;}_-a#e+<~vȭ[8y|5ާ$/lyyǵ)PC-ܸ:>7]OYiYr|X7mGlrKa ~>ڿS^p9~G^v - P`FsYU*[[Ce+ ={CHEz(CP"=HEz(CP"=HEz(CP"=HEz(CP"=HEz(CP"=HEz(CP\r蝓*'3: lPzz(X)JC=7^'yyAQu糜ҧpx= Q]nYm!Ӈ-xYg|d)7Uc!+W|>ZTt]hH져 W`?wOj#kO}΍/x>O+:0n>\p`wH,2FIQx뎟4ZU4;{f3y>YT`}ԅ!Y+aSL5Ì#fe"E,b%ri 'A8 ),*I2MiLd$$&4II2MiLd$$&4II2MiLd$$&4II2MiLd$$&4II2MiLd$$&4II2MiLd$$&L^ɴTчߧym~Vdw%+Y xc M>ߧg4L;WP}6 oXkWx#|K:=*Y*j";{VY_ʟԹѭ}EW1mn$@K)5M]Q](2ed.˜$bB;4}2_hNJp6z?K~]\H9דɃ_ᯀ:=”o(#;<\`Q8}/F6o.|Ӝls;|C omc{BVi [qeW\dw{뇎>kEl/n+8k}vTa 4_;t=.6-#\fSwd3_Whz?/fnoCn땯Cq?.6]v,g#p]|/lWj YV>n!(a8MU-U?|ٶ^յ Sf6绋ͣ;=no ap16; Yfu7Νa1io}uGns=  ?-:気|Lԝ]u z+5׏x[sYSs{r5уl5;m炼,u)v>.p Sby.c$r?vg>VX1jU]{(c~wVmut{C[nIo37a8r7щƟuΐ cD|g%i?(v P )/iU~G"RaօO׶py,r"oǰwămy 'sI΁]yl=.)aY ;wx:nܟ_=;|OBK]'!=}O\\|f28h99dtEI}76O?OgߦßY>@o;|5L|f8|a7_qXN^.·/i0ڋvFx }a@Vd)=`瀆/ihDE<C[~0\ȟ p; ׆0 &Nm'63% =Pga#y)+N?fɷ]>_ͥ//us3։̨ҧTD|!X&PȱAE Њ`E@j#X8[ғAVlV>Ӷ49mۣOeg ^^u~i^.y9`4OɌ"괈KJRϜz{_p_t9$rI`6b MXf^O&=[`a T*DYHE]K<+f="z?C{AH w!1}*8OpiY'YaҸS& NDG.!ly[jT̾Nw_.ȥC\c`N 0EށӸXjLBϾ }QOz~5D5oPn:?Z?)&˜+oMe!~K~ UM]p&N/ydzr{^OMvyڕLƮGu|[qVe pv;eV3Jޥŀe77o2n8q <\ (m zenpkmhw-(W_p䡦_{g *=/SdL L [̚"'.^9((^8(Cw|y0kK}F= ~׻Tn@vczu&pauenMS{rxպc4=l}~A=녢fW4ms~:q39hs=Zy졚h\y`%lr^8;"³ R|*2M}{JQR"y!Mq>חL*b%R!|i ')́-:5MV:j6#\PJL]ց!"w&)v&0eUKx t083KO]7ݟ3,13_/c4Vݶ{ a$Xhlnn=yvtdPV|J2 SuW"o$CgCNx B/$_䳠 nosh ך{&0T6LꚇdcW4\s#S6x~|8;qz`CЂٽ#Zb~ tq\|kT_8'A a-DYaՅ۪*Mt0C0.t!ux=n`e&TZ{-vU\7р8ea F2ajVS֖_kLi2gzChn)_|!UO-àk(D0(%칦 :d'݋ (&rabd1䀨p1^Fs&J1w,DDdH~~]mXؕlNcc}C6J98ZpTK|R4(k%=1^S}ac7GKk,4- bTeEv֓s Nb*~_qOM sЇcn񡎏\neiy^PIWI^YaČsH翲~s[l 3Fa91Q(4!MHHD 
i %"'Q5; 77YnwX \lGGHQ%gkO[t #Ъ\y,t%%̇wZn{wgjCL4FKlpi}ƥ8fOl1EJxC6g{6+'pTx9X_c.3~ܰhmoҜ0BxMďǏ ug#\|#n]0!=ty/oe8XIDnF%)F2&WMhMxYOo%6DycO:!Jʔs3ӈ E/}⥍ǷwY\;9 +/>s|}ha)ϻƱ}h|k}:N`6}ey1Wh%Uu?/QzAs^ #'&`R -thNWy ] ]I"S["Rw_B֝d ] ]zZDW {+3[ "NWr؆V=&-+5tp9 l+DcDUCW+HW *i]! R4tЕ7oE<8]-8xhZ aj1CE_2A0BRBWV ] ]),w5e]G؝"'Q|[qhtTpB_߂+-#[w.)N1Ywl0O͊yUF7<UW"FJ$P3q7ӧkN Af͵kaG2@tfmQߐy/?]iv/br½Xj/"i' SJ =/dխP **ӿ.>I1Y!;[;l;'f#Gaq%^i~[7OS™,F ձ[m~FɯszUKTǟƛAFs.{>*+=1~$LZ>!֐+¥rEү;"kuUEn]!`߳nBW—d+~[DW8B ʠt(^CWBW~@G,+[ WL Zx Qf*$)۳Jb ]!ڠ TMjJ&ʷ'T"}bh9;]!J䮾R_)zA߇?}ӡChB(YNR Еj說t(Y|fuOZD!})Oi`p>b(ٌ+8~0j]!`epC/\ W{BU+ND]`ﭡ++-tʼn$֝%ZADt`J=mUTDiCW+HWr.m]!\Al+DU+R1 YCW7}CY cԦT;5tpkA@{u/OWkRHWɀK +2(!ij=]!JVG8`T|k5P ̬ tV`a!\,V҇|iaQ}-N6ZE|߳}l{[tpux)Bt(k6tuot<8I4 eb0r Q~oTv8 vuFZq͇9~y c02B2.SJRNY+E Tu#m9582յJɁ~+8üS6vsuwn'O*hsgp:ޣyEƣkOwJ. &қpZ 菣aFt /(|<\ŴkþO!IW6U={8HBw>M|jU?Yu8+!(H_ǣ%Yaj"C4%`]͵TמĶڳ?gsC\dkSväX{+@a +&~e+LʲaZ6dVu\ cXwsW=ַEw^l^ 6]3pz?6N3N۰;mV0Yw^嫧;[/v&?ݏOrk|`>rVt@綏u8:loЂot׆ඍ^ڞ!fpfN_ |}qݬN"ǃq9=aޙs/ɿ@v=穎^{~3,'gA%‚Ȝ_~8V}my=̎>I;Y et7 )o|dOxWs2b[8jn?gjm8bld>C1  a`5vv7LڼՏM36Cur++ QAo]~QfT~>E4zSdU O aRδlPa pG6pG TpO \Cύ<;Dl5D?@˱fLCMlprHI9TA251>=c.v)8$OpGRI$q#/{A̡":h%:LWg3%YșP !2&ADݐD󏋷$}8w@2/^^feQhM38\8'jy*¤ja=+|P A6 f K#3(J}s2y7U ֝.  \,S$}}[mN(??**dzc*y ޙq~r$yyRg[{pyyWd*PWDŽP*Sii7e;vօlqFڤל) L9>D#Զqٝ753+ SpP08ӫHw[|ICΝ4oH7?0ٶt;VQ?p3Y ̌Xa|\d}V,ɫ4MVPt~~{\*|pLj [Z@\\_6-xN$\u6_xd33f4cBD|<λצmJ||#ʧ]DZ;}anKJB]2W*2M^cćǏ?AI~jB +s#7J5Rsp-ITm-fNXLtud[bh(YJPk[}:&4>hrQn.(]^IE|+}9f.# VX7:> UZgZs3fjX>x7)Tr6 VˮY_v<>qݬ#ۥAٻmlWX%i( HNn&vM V#KZ}ɦ^Rԇe;")KbfD| . 
y=1x;Z/kޔeB`~ WgkyMO@[Uq0p95t AJ]ByM'('|e;{i=NnׄFG-gXC\nttDTⰸ9" Ž52Pyn-=`lmu޾gz6t~t%<ʺ/R;-]Vw%߬]#]Is}^"䥡+뻢,th,NWRuiHwt 0]:]e].]eCˡl(Ŋn*oSq/]!`/ ]!\4thu-NWڹ ]-$tp lwnuUSqİP5Ro*TTh0P Y-ƹ~wq iHV@]MD$2tT ,ڑVAWNKYh_qnfDi#mH!ht]Bh3VVZ{ ޵U@MXt\2;UHe4\7#_ʈS/ǹ2ҀJ!D {>qqśTsh~uxpP;>;~wpvV;;:?8T;q4j؁Zn+to8)݇g=nOP|"s&f2MG]0 ܓ3Q8m Ȝ";`ѝG:R'R?riQ(Qab hP_Z-ҳ @& \.&j2c(z&E2OQiZDT:VnĀThUln3{4 oVеYMf}B}¿^W龰z"O-.Qקgow0-tZQɞo''g^VbnNkoҩ.ەET듗O[Oʰ;F?IW zdU 0[0_/;[ mŕ?'-V:VFAvd`6/u%4mʭʓt}D9P Z,>*Wժ}/Vr倃fsJ9A8b6!cg*6aCueZݫ򤂋9:Nдiظa~ bxQ:И@auuֿu=kT üYCg-ہls?(髣q|>|R' j$BMVi7Z_b1;_:4_(uxrD A7씼wmԗ]t.?n\N"مwpق{~WDlK}T-gZ΄?gx>v^W} ?}1AWV^TC7DOu$S9c>\8V?RKW0>2W*uaԁ1wk^ r(=:|?hG-%ħB︍|0{yrzdkZ#CՊ$hᅩunIcO8.x~sjiBX]MuvV%\ gfȯ(X0U .4PS My0P͟}+#~ lR Heq$~ f P 6wR.~ xsDZFM Ǽpp@\QUhQu{@5Fhs"nB][{BstuUo#%w׾9Dn I^Nɸ]*8x`;9bw] EkLi0M\νēQt ]7㮃10:_E{`ʨ_UJ}qrFyi".n7[-y|ΖmOKH+ř|?y1;<=ϕ2-EA=nOǎ =9XFek[,]Xso5|ɲ+qqu6 Mնv*w1'㭡^"P0G?{E `^lelh2rLu!>2/9sJDW/]!\!BWv\|E Q2gCWkHW\JQ+y?Fr Pnj .DtEy]Bs u)YuBlCWHWtne+Dtن֐|I է+\++Kvh=yrՔ7tan+P4tp}MM6| Qu+)=$, ]!\)BWRt(7;]U ,gt!Lp}d*ZwIkWPz+FW,] ]mz*tV&N:oz8R /LɆM% + 㐱e&9l=y^݈JR.ן MQQ.*qp ;}qJzT:x*!/ֲb-{ Yj0? #fm`5qjd>v^{9[ٍ noUlOɵ͜~ǜz'ED[!vh:.PL!(db#sBէ[MX;Ԩ[+zǠ]1 'z˘ P |z޻g  sk`pZɠ>/Ԛ&"j\I*A?:sf#P?.7:tX8U]WƄ?l_0'U3’E5gzלuEs,%ZrݝJK6|2_30ufCeX Zx _e!nbưٛgs_ڣe4 C1A%u\;"L&Zx|0߄Klo3lOw7KGrF_Ӹؗ9p< -cdO"J6q4C["Bny!.YV$y6ZKD0uhi r,th}g QJ5+c镈0_98/B) ]!1+DyZC29sʳvp=,thʯ]J ]!]wy#|p[B[WRn֑֮p?Z6%D^K툒 ]!]I9)•,thէ+Dnʙ*aXdt`rU&]$~&] b&؝S+AWNr6tf+`]ey++ܲuul6:ҕt#X߀.04N=Zu5ykS%Fr, 4 оD͈(Hm6^אgHsܕEb+w_>IW(h>ԍ(I2WH7TB(04Dh}st54v[AqPhA_%/y*%qS"cTwVw/%:^+l5vN_?ډ&")v֞}=I7`{]7 }l'&şO^uh?šwݶ 8 afG0- Uzڿ/鳓8pugJLlRݥ2΄ JL)1uZ^Խ'2t,a㟶]dNA9|uD}Ngggvzz<c7Ĝ&xX}%,:NHkU:ƨGↇxUIG`4ƞi/'>yVT>z~d:N?;*'n 0/Z y 0G_;*4$ѨT2*E0Y2fՌJlD:Wڽs<P|[ 0Z-äO( BR bZ#b"+3 gBR2BFTJAQh!4*99#B;:a$<-݀0JE~~^tp-e 1ۭ)iR@M0C&=vӨB>ޮ$og=[tugBkVۅ+7Ȧߜ3F @@Ⱦ9Ԩoh/IlI ϞgE[|9#?vOjgԎO@ⵌ96~5u$G }T &yGAN[("@3P\b$HjiFF(݋Ws >Ah90PpK3/LzLqfp x!JED 7N7FADJctP"#DeG#{~YP?s͛CHn@$eak8 rqrq Zxv@z#0BuvݯeOQz . 
ב ?ZDWEUX ÃyԬp"n qט<}&AW BFKå&"8m+E㹙k~Lu8c;k{mȒ*NIQLIER̶Im o„r=>;>y"*!NU,B HlUdN8C1ńT/Bq@S75sjI8;f;C] fqDjἻtdc[6Ժٽ?Nߝh4`L`g D 08u ˸Ȁ g](/u(Y0˳/.aB;8=vyޝ~|sO~Id " 2H38@ ab0 DIbb+|B-_^P΁4R UPyơR1Bb7!I"D4^`!WX <B"ʘVHYpI 0xD)*%C4 wnX[=mݙzGsL9ԯpzD-qwFTsUo&jV5Ccfe12bOI&j_>-2 _`qc:W3]H1y<D^tC&c*Y!Gwy{>?rgJV"RImsKjВeeN#!T ||;3t{AG|WlQF̞ çOʈQaߦ5GU#z-1W.a3^klr+E l,i ny(XCl8mQ2\? |t~תxAY?חe{Vn5O?C(<ܡj 0s\͋+ȮWnqWx\RI.L!]]p}DIimX@JHWzNw#N/n3K@Z̶>wR45g1CpcvlgK+@Jzp%\J]prvLl;\y)v**$d  \yqH^*jR&k+Rb ~k+/.gKKٰW W >sy%{%H/@hv'qaiDDDYGY@I!:mD{o>:&b(4h?/l[ِcEAL E6P!ghasm4 7| PĀsc̰`<tD7^*ύ}3M {]1HbcgvǎF*F4 V"PƁ?ckwy) X=qo\khzG@% Z64)Aǎd,HUHsDAHNj{~)Gņq\O0>U޶p|W}bdS5>J}!oɛ=~4MMO@JȣmsK&@~bxk~s/WٵZN?LQa9$ol oa1vfs78˙$8ٌ„8+MPV).zuNhH4}ZNC,O`K"1̱qa9Ns% A8T)v(A'jW"@ZζJNM$W*=GgW8&n_VԅII[XWOzp`JC >n L xQ hi `>qlyV*]GyO 9Ys5:r}kQ[&@7P'@IqN87U`8IJ UQO}`#yLbei^7,$ZW{cuUqWeB/e%Wz⃖VڤXgKM`U )E?!V;bWԋ+ծARrԸUD*-:䉒aotF/F=PFRZ P͚[ы7aUgj!+9EkgbFgT44#UvbYT3ٹ~A 3e(WzCBB!GebW</ݣR5hOv@`3&.W^Z^Jzp>z•XzW]+&dKDWR+XPW \yi;\2m4p,p%PHУ4@b܏}ϼo>;ۃL9y:?{eX^l B2D "__-( 9EE͊i[7zuXcbJk-U0ǂ JBíA #1WC1gkusiڦA9 E873}Ir8HDuh FURjE*d]$,Hc42Hذ(p$TSAb* fqN_,q*%hQ]zHvlʅ_Ӛ5q7tA5-[Rʣk}@E᭥-p҃ u/gI* \n?61 =9`LQzm/lZͪ頤W&sB}ԦHK Jff)I9Aeb>jnjCuoG?kM8p8ol aNXC0G4r!#𓉔IKa&*:#I9y v;v;U%i)LqUǽnt8/]as(ԄDk(p@|dᷡ0z.}:6{zϽ9 _)꼒YE+,Մ(5VbG£Yͤ9(^[s}N=횻EM96C>[@'Œ+ӧ3w絀/IoW"TI;Z(dش1x#C&CNS hcD+-`\ d:{0(6~g] ݟTݬ>+\-_Z&U>@E`tnc#6ߎh+ Zc3h=,?xeb,Ow9~ʹ !kJٱWF6B S]r=;wwА6p)1ѳx!]kLBಒ+ӵ9Lz3C]&)pB%FZBu ˆ1&\2Ui.Se> V\sPtS| W-%@KTF^J1N[ww!Y}iD2X9g#id"sZƘ8Ҋj``"˾^Xלi>Sgzx3g8%o郇|lJWNjvUwx~:_Jֽɨ 曧GcwΰBe쒪Yz8zmnfUrh^`.UZ}2ESP:I} -zhIyˊN7 d%L[W{<-lBͅ_uLh~c{r`YK*βq^iudjN?)eki\HSN aEa}L>R} 3qp9!s3Sw q&3l%Ȏ7(M(7 R̖ 4( nGG7Anj@ 7,B6}חxX4nDe7,RcbNUJR!L^Ռ7ljR5ɒ鱫+TKkm#9$2rƻ9/X'Q-qM )ٖ_ I4z %jtOOUW_-Mֹ}f#7C6,KPԴ8Ny{ }zhpf.`Vq{E}.k07oy:gx٘B 7?zЌrfjAcmhCO 5Z-ϟH:V#첵HV8_D)XWh)Ss]o^N3~~[Tb`ʶB6|/7n{V'ԃIL t*R-։IsDJ)m (ٯx7zQ*H9PBKk ph QG!"tJ]2$h\WX{%Ls^u=ƒal@j8)b .jܴЄj٦gۇ]wca44o7$uI,/Uffyd ]9\Uϣn;+{ΨZkE-g6#1fdnU[a"c]Q["$Z]Xm3F. 
*4@l:/v(t&lt@}2@ Xg\Fg}N1 dƟfKv|[qX{[8ݶ|Kܰ* J1Bnrio*B$7lvnô¤#/ sD QY'|^Q D;L O0ɻ$=$8K]5ϗBt%̏[m3ͫy1y:خbh %ՉKYJ{zK:ٽH{]`1-#VyP^ Y)cEZ/ʍTytE&&!ǝXqHo$6B|<ٜ҃()jRԔ)=ɩʘ8erId=II8 QQR ߋZ+PѮ;-..qGV]{YVӑuvz5 in8?R60#U3β]J%Q3uB`JMLXV'z X=(7aI*7 D8R9pШLHP ZTB<|L2 s}hWќcEǾ!@ RpjҔG-XʊFE5Y&4wmбhѴiEvia;P8[DUDtu{Ŀy_snuKsr 3޽ş=1&xJEZ 5A:$@q@CWA‡FAW*|L`$82dD"T6Q&ĜK.j SI ѽIȅƁwϹIq2Q[%QL|[wnose 8OǗED8?}3nIȐmmfx-+wt8.n91(< j LpR{a8a*丿A`KkuNnvBU L+`ZKІit҆ 'Bk457Yc7j^tTιQճFcQKTVextE e`Ix?OFHcMMQ$-Jd x* "\ě8^9cQH^:TQ~>$Õ-L[̲WwgC~~="cGST cjaGo.rD.L0@KR1BTޅ sOr fZLNPbӽbS85g{Os JQN8PN99:Pt g~xu4@#xvC[E x95R@ͩ+A8n⧣,E0ݧT_Oc!V*ې:w!_5իhC5SJS>LqGVWah(8[<ssi^Ȟ_CFr~_<ꪽg=9epugFuљ3y0Ji;^REmEԍ;kAHGKΖYW3n;YwFFQ3x4b *#[?dW]ϊڒEӼs K'!r+U9`jܮٱR-7U(ho608?EqzwÛ?w?;˛woY82IJGoWMoߴilo47M+S|Mb|-qc99dW?_~nLC˹WnZ嵙٫s+Q+mTkjk4 Et_ە\\ 8 1!‹qiy##ZlN*AT47p08?@ w-1 ?'Đ+c {4{?)=o. -4~xKEEHzy@m Ɔ&@QȥuH "ۤNׄ:SP&PVzYګ;W߹RY{EU Xҡ6Cku:ԬHq~6 loƓ>(T3Lټ/l|n2A࡟l!qc( Dh,==hS}b"AdTSn~;pwLp8;*vN}d[@k΀ʘx8$GZL-sjE  vwJ;k pxC4^^@L^3ἜbZ.]z ]p>p!`1WXm3F.k#33Y ]x lrFjL0OHexetGᄌ]SV2Q j@adXmm+q>';*G;>C8 1ߌMWaPQPA)Zf\vHnXѠn Ťe 6#D%gj8RZ Mh03w]C#84x0`&9Z/v\>[':8wǧͅ>7wY;~;&Nwb qx~`pJ&f񆢑^Pcp[*PHV'S.yg+-u,,Ft=6iBeA"s@WFLLI>$Ax((Trr?w$D[)@ ɭI?Ep,£htx1*Ą)z*9C?T3kvs:3 [da %y{Njf-~s7){\:>( Z)=1ҡM31qP zDp;Jp u.1+~+_Ѯ[jעG..ӞlL:{ j7? S O{pg.p^l.x J KO;PfoHÒTo$Jq2kQ"0εKccıCþ "+: 5wmIg -a8d:$ 8!SbL Iٖݯz|CQҐ"1 ɜivWUPȇ0% Ӕ52JX,YE6daYN:Iۥ@(fn&PQ[{KvY>@IXvř$#!.eEp"y3q3\r=Gՠ g`_J4R(ך.4.+9gE-We*rR@ =Q(SB:81RLups;+V0-]'|A*jX=-f( { ^^O/6 n>_./`"Ic,qjk%N(^2 Җ)<)}. p|&j2zkx=#=?~dPGQIɂ? e3,vȃ-Raș)ɋY|WP@mɁ= +.C涥]NlZ -ܛzW)sR^*0署՝_dzEw{uI8'n|^cү}&]_l87nFWGP^;V^ՋSuWxq|p +]h|Я ;D&+#&PT!tNq?V.0.E9794]\d[RKUV/2.$|u ?UϠ"?_Mv7mMc}{/ڦRm|j;DYetrsrCkUL8@q^DQ),)qD9"x d9l;YYsP|~'9 тwg4S6xҷØgJ!Ua<Րk`a$KKI  WBeSN0\֖Y sˬSL(S6!Qd: 9s^alCJVY+MbiQ -זD%3:N[&>8;zqWۜRfj36y/[IxfTd uw>z2-ɾjmt57A}-8FH`A<  mljT[՟c&抢'N'P%/E &K+ ? 
_ܲA=93WVƀe ։ՖȮÌfXB;RAt,7QC _/VNí'`z~+ }G0LQf|Az+Ty'C^P?~y<&(!%2'""1g&X\I~fj8tԼ~2LG`ѨL.aǢ2TTN]=GuűX I<|u|<\"E]ej>tuű$ A}L@0ǃ2z dSWQ]Ty4 UhUCWWJ|,~+)n|=.ZgQ 10 23ۅ"`"ե&w/9 #2rhH\C0yfYd !G̳-h8'z2?p7JQlYAC<u3c-#y`ѸHJwFNIoT2豢S!N%egayW?z7P}߾`$&&G@#5?_мazhCKSW::6s\5q#3us͙hNz9%un;g0B Cן|Elhh+3 C< 4 ϊ mW5Ec&ハ2Bzg(:ʰ·Plݞ󦍱ow"YXk&<6˼wRzN]ʙUK4NH오,ւ㢸JT[DhmKSՑZBԿ ׁ$OsyL.y\bT2a#SXmB uP[/୨г6 e;s1y~ !_) m#69x0b9ĭؐP.j|UW﫪Ho}!EeZ1wAlb6щ8ygcX'-(1_;+(nue`0>x v<c@HŊm9;[{}˵bZEKx 9F-SF vwj!8m<1@A%^=ss,ffm~Fa}ZƄ"#g:'z-;bt&;[Yww:KnȒA7Wr'@P\5%,F"5CbuqyG5zm<1xLB3ccERR$NdIr.<Ş8b>QP%fC|E|A_m]4cZ(;hI{Dwc%:n=|0zL6GluA x:p}:F `vix<Q=BQ99+[1>ZXtD#nERL@+v䤏oW12#XP1g&pMXb[Qf')R%=cq/Afz^[ OP fTC0хˆ+{Z+$,r!c)f  D96XG0y4!a/6Zke ɘV!!S($^2 YobA% %LR#1s:քl1JZ+iŬ"|Mxs> Sq͋ߠ臅X gWzagQZ s#&(YT'~_)tSVOM>|' F, 2x6'pe3ϿE-R;==J6gS@6<"0Jq1YDDir޷(E H}L2tISѠ 4GzYoptVf!/_P|o鯩ra771z]u)&TjQw8| 2|*k{/ڦגm jDYeܴrsr%Kg2D'Hkڃ[E8%Rb%0l\#A* ;L*-zS"ilTGx?i漕ˑF2[lsųn%F>`Bqfo 2񂗗 S Kj0;ɔn#j/4lNNdCֶ dj xQ7m%i-Vv'w鮤7}[>g(Kabr‡P-u[7|/#2ƛInO{LNw0 ݄1%TY yO؛wJ ::iס諚 ݺˍ 7tI{@.O+wǿ8tR\{@z6W`n!܆!Wqy5B`YY؈r!lP^LRYC]/始wT]r0(җ@ZJS5w J5ZHQI0)wiG{9'$8呗&a9ezh֓B29_&z1Y[g$ 7.U%FEt*pN"d|FÍ㖱tiAr%SV:iuU8 O,@ ~wGݔ򦖯)±Cj]/sfR[hO)\jl3|i/Ak1==oH:6$إ314361 !J퉮ڪ6ob($5U2^ĴJݙ[6W}yN=Z1iTy Gm\j*D4''ˠAE),TqGq߀c`bҙVgpbӎ 7FAZƺҍ#\(wpP O=cF@ǁFXp4Ek9 wm\U[O߷ ֐gK/u5m%w-[bĪӡ?X,xc0cn0iNJpS lJM>|' FL''+h|2 Ǒ_R; QĺL s [X%rfX 8rsWthɊX 5zUeV߶4=rgaj6@wxJe4]KO딝UoqBMvJ 3y_|)M5݁+Q* EBo;d4#u89T6/7;x5H+RW@0jU"kQWZ.]]%*i՛QWbǥ^S_Y]GTa:j?*,u%PWUW.=!~E*ըDYQ*Q+ťDzJ&n/;EŲwrv4Kt`0Qn]AG?;~=S]s?v"{r  Ƙcx4=Xo/%|3}bkL1A;Oatʻ"Q.jnROO7KY$d"iY"M70Pՙ:kTL9tQT 6œZ(ڌ-2m|Rԝ1+ 9nW=$fʻW>]8]Rg6J^ }?tz5lJa^iGjd^ wE%Kʶ1EG$j6(<'g"zbTfSz@< Κl4?Y?D9 g `3b&(-i^].v)0`)5E cґa.,Ǡ){"p7"9k4 x$ )[X1c2b=6M 0!- bi/ٖbR xtpAM܌37>o~_̧VsonI^&w/2W׷^eft]Yݻ EۯҭWHZYH$Q (s?Ȁ?..Z"> -g}lHMun4/z7nHƺ$Aa4lov{AwA?ng%:ұ7p0F.~lcM7wk]*$4~{ff~oɧ^Z׹u&+ JKq5O\% _zS6?⽂6gWd\Ydeߏ8ԤX̴Q*c e2+gI1i7:<[~\}ZFҞ\'fCg蒿Dy‹^L~lxlڌT}kSߛKw);VXzk mKYJm* ][tw~dZ5kQ_5Y5$](?aBh^7 `p#r`E("҅p08wՙq5_';|YQpRe?+ÎK܆c+{QzUq-N98]u6lj20"&m"e-0jMپ߿Ox)FΝ 
9IIHƜZ@RtkleF͝Qd:~F\O ,HtQDj$HXZ L"AQ)/pvRʃp-1K^gJvI^wbzg1 ƀ1ŢDwHRE1)(d2F97IcOpٵ.&ttpLF AMjKu5-t_4DLh2QӲB QJPaZ*k7dhEYR:[£&@܊`mmKEWg"Yԏ/XA(Ӭ(adNI/+II|'ӧ1x'n=;: ں!""+pmpe,{%!z=]4 QMޠB/e}t@QDh?&)c,C޵Jp Ў2vp >tЫP%l@Sƪڜ]1(W,pZnkT=^pӡhImGhK;m,tM39(k$~2CƠ#RFdC0̹+ c;gQPBЌDYl9icy!X0M`]#53Ko^5ZR\[oEXY' k[ 3$]$/PI'y*0n \n F`ⴱA[9NP~۴\:j[`b+Z`8FL-FTai XeWZ,kHk sQHV(Y5X`4vƸ`o󀕿#}0 |wCPD,T@s1JGlFmeנDKhLT@=B T"* wB)Wz-2'ӫ [QW*ϕ+E3)ķ(]7yM ;bC!Dkz #EiQIUkz y?]- `AE ѳdD).XZ#eBFZcuz@pkOQ˃>t WMIAt-c|ze}\U8/ږj{h7t Be_j;J* YktHXic],gfC;dFkÈP/j@\D/0"#]2󮽧ɌS:(t1 8J*t+W /H46j!$rUj.u/cdiVM6̈́БDSZ,ٻ(`r *\cSYZF+Neg,2BTvW{Yx~~ZH%4Ġ|n-W,Q7>o[ 3~̕hPֈχjP}=^ ׋aʢHH"* 8Rm\ʙDBH$1="@(w[S2I_Ͽ|ijk^lB0_R#c]^EdIw_$P~E~? YuNgez"fq?q,%9`.gW7Mˠ4suvz5_vhɳC_%yB-g6^B-(n(n>d 6nZ|tn{iqKg{2X;#ôExwX=k|,%ljSdL L ̭Κ"t"N9((8(6(&4O^=N+60SZLzx 1,rEY#e-ptv´}a6ZJZ=6i}d|= ؄uؼǢu98s?p..g]3ؤ qrzܽoxd8&?-xglω,Wn)Zy|M9^xgN_Q= :r E-ႷIOu2|pi9'yBk=g @Y @Gj*M#5hԠ4RFjH A#5hԠ4RFjH A#5hԠ4RFjH A#5hԠ4RFjH A#5hԠ4RFjH A#5hԠ4RFjHO=RC2~D#5Hf72R:XGj4+pdo\79F}ҰZTM×|0kCW4m:pUPEh8Ϡ}\eV*x^dY傘k)ϸ{gef\~- {|hգqibDqg䮼<XNF$ٟgeqrO Uɿt-}yW*ĆKxG3RF5֛Io <c Lߙ~Lз@z'fNf\=3L'f:1ӉNtb3L'f:1ӉNtb3L'f:1ӉNtb3L'f:1ӉNtb3L'f:1ӉNtb3L'f:1ӉN``ģ\dmsb<|H}p2Np|)c"vx0ש!Z><㟎Hϵ;[HgXx '19!]·C>lX)j٤󓬤<q9|:K&yJx9=ۭ"v1˖i^VԄ$QnA;*XY&Y٣tܶsfj1QKcf dwٜMΩ;Gcˋfo ގy|t|OkpЀn9gh:깃g^R(ijrE;BLL/6R䫄eO%_2&%'v ~OSa4EviяopȄǾDD~qbKNzbOxzIRbq"S$-vDZ&D׋a?7.*-JÏס1)*W!fLQ@bhJ}{V;{>"A:aaO`m4+4ʶ=>KeyrWVI:i+zurHE;cЦZNJ>+wJh#^^|^͗sZ_Pw~0hD|,\-7e)\  G ~!.z-z(E6&2ªŹN'#jG4k_ Y ˣocy 2&=\>Vsw`:ַfrsAK5_e: }eb`F<5DK'+GCkӃN*ԯPv>n:n{WwXKC6ta >;^2n0Q{7vt1fH>7#In5r(e+tsAR Xm af( KvE$<7 ^9XJ'[fGXW. 
2Ϥm}i7ܣ_0煚珷IQ*M R12W/Љ ^h6eC%ѧr?ޝIܝ?~> nN hØ0P|6@nD&Fߐp]:!UWw " *1`KJsZ["U|-~=$m^VKL:< ]}dg3.{oCGzơ 6Vu ( J9Զ!D !-fQh=8RRDib TlU BHִZcԆ *Apl\;h''r\{Fx}9)/~i@ki >{3'}yp]6m~"UF?; .U`U >KJaB+ {f]a]eꛋ__ CH#Iܺ78A*/RUMOOBbzj,մ #IDdQ9:^D'"Y>u@IEˍgn>ugõ3gvlm?ƿ7)w|8UΠ1yee4$ɾ5Ci\r$`fȥ)7E?_xB6 .R g #L\y9v30r_  ʊN[l քTiɥCRM ~2~:>?KTƳ>}L6$C?EveJA`D''[r"|%o> %ͣMßs)==6Ja}L˻/3eju9b>X^xjFJ_`.q /*R}ە@<7)Ce%dJ߾e2}[Y%w>' &,&M>55]Xݽjm=+HhU5T=g?L.M=/@~,g*K!IPwL*&UM6WÙ^]9㷷ξgًߝa^ٻ/@00 .@xo~ B?iYgKzT˧^"|sJrW2ﳟW¥E] ]"KkUm bs3_v:#׆ E|-as]N5K_6 p}2W};}ukMT>1c(QE9iub6FO~2 vgmo"gexPQKHR;pm?{o'E&I78V`@ G` 8.HQ(8R;u%a^!K79SES: DεAy'\beԁ"3P> IYT&SQ_u㡛6`_uk9ּ~=Hx1``讏GLHxa5x+R3Eyjf"|dTyC_+T޶YQkG7T!t!Đ"L=![M?i;B䌂X*gRbrW4wkt$`#]7I_1vY) '&S,dK/QQ6n9=6ŻyC7K?tR>8?O?O~*ͫe?,k[ D (? \qOS:_G'wyQ :fx c7O$u#mRk0|WkT"@P'9-sd?Woz;&?YRP/Pz4Ot\T^&4μEqܻY.Ԝq<Ȁd>m@M>07/:{zራ.G"?wqTW_QH\W \z0Uv g*WbݼƍrO!MDcu!旷J7Nw{uH)兴i!JDx=&BJa.kX&:$h5I6\N^(7 jOn1vABş4+?G cNvCwGU^ %|w \{p}^_Wz}^_fXB B+ + B+-`c}j^nPl4x^rÜ9Q [cI"!t5׸I'5v$5R1G mw:2,"1rbrX+B"{o_ozo7ӯowb7/mowti$//a}t( ͇ަݦTY6ݲؤFeӡ+rodO{0 2>g\ $ < ' XrbJϧָj:{qW#{8ݿ6]7*uۍRp.= `Af;|WCJLvOK=Tn-{o_ayQ67n8{k^{rpB PЦTѦiw73= ( ;$xj<]@`B<Uķ S띱Vc&ye4zl5ͭrwb3 N~>ٙp|}Y*$-K{sDpj;f ='RcNI{0v! bXE*qCvYI(J1,,VJ\,;g;iYT4U3MOϥegtr`.[\6cl(NP̢ww*[S.Ƕ̔IQv˦RuѩTzö:B})n@ ()& zK 飶ěU`̙ ]2m<,2+ EDzAeaQzz*E#kf驺5{g_)YM ]0_LSZv}=6rHxv49?ny-'"A6S'E;4uIi[- {cʏ.-ShL1"h<)$ Å܍E* FכHFǤ"9̰4N+ `,\~, *vI첋t2̺Ƭ#K __ CH#Iܺ78A*/&A8"jӓK5H;bqNb‹Ic)f"FVEG( X!zFꖑS;\!XCNmF!c{qW$·#_Z$Ka9lYb$7q}H7%f( ߞLQ #ƬKOHvxن3E DV @`)"+48`(¥- tWÀ05!rTvpSD_Bqىcۉg&Yy>ȮL)HX̞֟zRNX#gbv ht46 AW\Dڔ3-Ϊ۳>_byqv)m4q4?L.M=/@~,g*K!IPwL*&UM6WÙ^]9㷷+mH藵:ˀ<6,Ƴ`1F#On6QIOdEH*"KYQY_~~}ï(w~[L*`Vضu"Zv㿗+8<2C)%cyXk#=EpTuHfѬ gf4a-0E?k+ \x4lofc Q%@պ rQIET|"O/&-L'u$}hkM'6|&yLtOZ#UI~qe+'=9"8ṏFgxa7\TY?cZ|qjH6ŠV>͍{}0c9xn4!$HPތ@&(狫NBek E u;Tŀ @GOU/,N=Uh_sZpM^p>A;k҂xV"M.LR2id 1T.PDɉ*X>ZƧ=$h/r;Ҳ0Z񈰆C2"j E͙"xtx'5M(/sD%*+伨COGgDfFzcvSxkHZ< V767̫}9WWxujrq;El,WVx嬯ˮg+?^yH-V^r=?^_Yo,gކlD2I\YR+g UR Ø@xiS1GF! 
A:/…sa=76tJdLgSG'LIExb]K|ez)&gug9ujU'(;(Sš-WcKf@:FuЗYǙZMO}qҘ2 gkX{(8^;6t_1N\Ř*\T0m\$ڇPb28sOUSxc(?Z?FחjY_ZЈMQ]73x3?M>Ss3i׬0eۢp^+6qd6-b*-֙.IA_4|y,+E%?qҁo >$[l_?vu.K07Tjǧ{_*Ic,o5ELhR /EiK.di2+Dfm~r6h/&=mb60x¯nFߒbէOw݁XV DyeʥH ܓx@{g =VNOe ؘb Sp'SeEO^W 1*O5X-IT\[\oSs5]VǬ]GqEL)|TM{UZ8A2؛"E*RHxb*'!r*bo~[y>G6JP Tۚ54rg2844BhS( >IZtS-[cקɟݶ|*(۶8}r H"8Xr(;&p_tQ+Mbi_rJK뀱 Vkp5?ioS޸#-P+n<Ղw() F*mHL+A4*bDA A$(tst:Pro#{YdɹvlW;upB.!6d7#E% ^[`v"m53/VexxbUFjqWn5NztCqzXa/8?wʅjC&Zmq$fK&u^5%SkةA*eg3EmQrE8zȅ#Q{Pu2*I+uSPT+$Xju(h*N^]ќ始:Gue8[(wN>8Osqq 1,_i;-Ze:0Ϟ`р[w204)a|5_ Û467a )&( !gH"(Sp"]1A%=:?`TޜH.'З=i#Jy6'#uŅdF]!\+9uu43TW G 6F]$]ejN|EX\UO 7#W:Oa.BJWRbn۵x-DݦeDnEñT>B[8˴ܛȨ~^7<,ZbBZ+gZ2t=9W1HL4EuP )K$)P5rLv_Gsp:K7/S״}Ж K:dAx=5$- 2Tpr:m(ÈH2> 2O}rtZ-W}茜"/szz\U>[ODg|S+e+z/ ĺ),Z|}fuwɻ&-$-/9nJ޿gR-;_mx(2"ԽJ66(G@EQjBHΑ^.)C}vcƠSbTrKPIu;#gf\R ]uX gY-/* o/[?nToʎFwXcik!60ceɥ *GI L#mFcwFfa<w֖EkۅJB*xEF \ϣB5K1D ݨGkT"cK"!pjT,eAry QFI{4(nKM"iE)i=żDKzC}?4*j ܋Ig-doRrVDtK4q:dߎfx<@+ѣ$(HB8e^fd։|mw]EР>&x?d w12⭐JjED2VC 'ITK B<fs"&S-sP)c$pPqh<8v(Cd(<]PezМ?i/o3%C_ъ;__07[<- ~c~ ۖdE{|mb$/7TQghr4 @'k43`2d6BK u6VV)%x4SUR oTDDFGE_lq/džrrشuGa}"۰g,(@E[HbA/eXLyyIQ 5eQHa[(_?}CM^& ,E9T dǛM8zɯ7].oB҄~~FTGvݯ'o^]/WmaBK}3[}Qy/:w/)-o"P7]׬NwQO {0bZ'їgw/mnkȶ5Z2uY:𵎤8}1T,گ0Hg9LS ;VT-8'~/+ˏvxu_~_y}᷷_}=뿿~%E"!WOvD^q:|Mx9 b95Ȅ1ؽlR; .lfWy;k| >M~4`!%Ѭ k py\BaÝZ 읦.ZHMѴK,JrB\D+iH FM8C=!M=^d ~?~ӸLefrVc"_]6mHX=mLܞU]VtZ-rAjIq; Y %f(!!pW{ĬD L#ԥb/ */(}Yp*[={w㣟de?ɖO_\y9S5*bekL0l[.fwO[NO9&f:Уdz)! 斩s#up0i硋PB3OiJK (,2kQɼ ̬{q29eiMddM{'w~,`i "|[thgJyc2<1ɝ"þf؇q&Rj)OF]?{ (((R :|#;Htq3vd`$ᣅvG=޲fӌxLv#Erky>?..c% xl,s@D2>X'ɞQ S-Nn0[$9%R,( q^"8iI<r .ntm>]MvۢGʪw횆a}s:ecn}<~} y $M ڛTI\%J7J?TIJs*'JWI!SsV\%qYo2'i%=us0o\^> #^c\O\\OZ*\OJN\{+C)RQwaؙ͂[=Nzǎ*o!@G?&ca}څ){v 8^!1im2uC#PNl 4X8w=14g!Yr(g!Yr =d,De,D9 QB;9g!Yr6(g!z&g!Yr9 QB3g!Yr(g!Yr(g!Yr(g!Yr(g!YCB,D!=g!YarJ\/zVȤ S=L6;/'c0?.ʱІ<6P9PNB NJ/TʿU\O[JWg<;L^[wJKXnJ6 k]Zreǧ{|vkB,i>#FWjU0 -k$r uh̲FQdZBv?-~[8F&$P ٱWGj( Zpsc@aGJcUD曻dt|ov@ڶ]z޼ub=IɢB:mRȑ YY4rGZ!Or!D钤 "ΈRR抶pkUJs5uwGs,&VƝ|odubͺtђ>@zgHf~? 
#:׾͋em),|ylW"نl̓j: gĖ1Ew1Iۤ?$k4d>6#Tn#}U4=*䧀*P 0`Cm{7IYc-k$!LӖ( R"h2E)) ߾osek[W?|+lsop!`Ԕ1WXi3F.[ע-=xyS|tS>Rl,:qQ8!G@9ᔕ@T<5©j7ck 'SڍӉsg2\Ke*N%6 N7SZe 7;qE뻕_?Pny%'FlG 2@L)eVqe8?aTI`"67< TC\d{8˰ɕJx%f6161aAX$ b7&j^s.w]Amբv`nK9G?U%"gBnr`Ң_Q̒FĿ:v:gh'o_y6ɿ'}+uAyX7 OYkXߤkn]+[~߻xfqc 92d?\~{Ջo:ÐC&4řyof6{ipnE"N~v^yEgXL.҆aM~~%3籂aD`Ѻ4[zGyZ1Hmg:US*VQ5p18!_Ec?$g$<e@yU~($GG /\1C#Rsh3Jcuzude5ڥ;F?Mؐ$(Dd]bt:T)((k7لbW˜1i4yރ 33ऽ<~{f3i~\i'6I)l'$֕d7\Rxdu~j&4?UQtbbgw U@n?13~zXgVT-/&ޛUBbq콎_KOW3kQ7~4[:`&w:jft9 kl<׿/ꚑ~vЧ/nz90ᗇZcqT\d?=/8xB)J)OGl}`!Ri1D+R ַolM!?P7@ 9H܃ Ec!Tm k*vPNf%;zk{KS{<ڇ/sC EN_1^R % .M?ŗ+'Z2ÐZ&\[N)l9{3ɚ> bm=6dkN>O jn=P+4z,J[s7J(zoz pbʪ2 \*[z$$rNNzmpey1Zq@ZE!qK͙2LN~ M8>NR4Zhe,ν5L|.KMqSx]x^r̄\*0׋bE 7ފ,¹fxb2[8Ot~zypEgVڬ'K|rmfV˳W7>ޕ_'à_;ck%6<Ǎhx\~޶~eo$wjן\K`%Ŗ~WV4jl WI lr/p;WYJճ+NNS}bzMz9ơpJq~NIpx-jL20޷b'SN(JKZky|3/o;Qng8o ߩ<7Q;! K-"@ft,< (eGWy2 >(v6M .F8[QxkeŒ:2f6bSZ<$]W5pSdHʡC{A岸 >RJRgH8Gד\˽,0WYǶ*K9wW3KW(&lo*+վU]+3+ɍ?w\eqBi5%WYJZzps  \eq*p4g WHgLN$FS}ᷟw! 4RK+KucHXspL$Ls)2)+\YmlN!#A;H@ zq0>~L6QZ]Cfp yKgz|wuY%0bz |qG=;To:^q#7V/#:QG:*hQӳ ѻ>1^ %0߮\w%ayĹ i!!GpGm`*Qf@M$*\"e4~UȮWɤnlK˳]?46?gׅƦٽh̎5sr2}p@6Gdg2PF,7J5%Q T68㻑dy-Oj6{br}V'(ah{@/d+O…dQS ЯeΔBpyB]xyS|tS>Rl,:qQ8!G@9ᔕ@T<5©j?\Ӛvb?bN3S,tIS 6^tliPQPKJ&V]Mz+6.CKQ[uz3j =*z'Q]jO#6rD0GTG8&7uQh4X*q~b8MϸQFI{)WzyGAGKĺ {J@Y!| I8= P/ 0\&4j4:c;Ms:l?hچRBcp )}Y}6+?sqse}{"=wB}\veeFb1qNkrS<(Cqm}ۋ)z\fw^ aT .# l _*0z(G7r^L ~<fr#lByt\n:|('U#6n;?la/KږdK/gM+9ыwKJG^LŹj,9s,ʹzh Ȋ01~JU6ջ3MGOv_;oL^-*gQVݖK`>0 W2s "@7fҫ+;o _rkVbXծ>-CavEdy pT(Y/.O6=絫RyC֭Z+'ojFqza}AIoTrRQfTNVMV{}_xW|?o{Pzͷ u;0y e{poqycKz[,-iO=uסM-e\Ȍxs0>}lg3wҫ*0#6ǪtͲvC&_Dn}u]d!t:v!Y,/m.:nf ֦9n>o.qL*%pF'$vLRMBEkpQ\%["ԥΨ;wd QIL׉'{b'9|<$NU+v=F>]M#S;>qQڸQkt.jm8bkCP=]CJ~'6HA2'2 #ƘC QlM 9W]A5_Osc K4.>F8DsՆV8kքx]Etb.*N/ q e0z =Quzƒ @Z#V_Wqו Bmp7PU&0.˯϶X3G\>*^}n~~>k 7XvsQc0*|_ Tjt5vY}a)B!<,68pfhOyrQyp>> %*4׀ @ԈGːc"I0UW4"t[C᭳;gǚx6X`D\X3b4bRڄrDr_}y7 a$jOi{|7ɧL[̼0r3>G nN{mp~6VK9b]NØg+O!'tǟ 68bcڒߎ dv)]Znt֠^@B2x$$5 'dֈs,9"*V=vu+NFVQMX-Wɸ 4md{cczcՊN[1{i5!!$sd|=hB3A /%) dkMA/4\+IoihOޱS bdZ^yвQi 
3:C+0=?[Rh(6S87Ღ!N@NN;:rtzݒKYlT}.+[ȁ6ua(feJj">t(;-A?ϻi2t93+/Zjﺹ`tyy+${󇦱~('W7tcy~:W1:>ιˮ+3vUٕٸͧ , >N8v;ښ,W*;K9EYettstsRg@wJ t?-Gy#iPQN.pj].K5fGOG+uG5tuĄő1=\^L5(jBdQPM}0Vِme?ܘV)qxv+>:pתҌa{Sƀn`Gtçcxtˣ.x'!̋ tA5٠ InQt> tRTh2-~ٲXlGN҂@4t׮b*xb&csL*oC|π.+'n5h. }eI9~rB^BPs}~U`#WhO ) $ >< ۛ~G/litg`y8.xsgqA]CYI H<$,[g]?eً*i,\(cBo"0)$3*wtrqJ>6y-D#ܻ+"ޔ=׋Oե/Kܫ&1?-0{Q6|fr3o| ^}2oƶd79ŪcW.486nsPǣO6ǽ?g ͤ:ZIF0*FB}`tQ1zZ9<:NiAåLrzȎ0E 2ұlGXsJS5Q;5Q #68& XZuHZ\`P+4zpP6Ehm?18Z*۶Vp ibk51#V5q\ 2N=n3CItm[zPpfyv$cWϑ]I;kvvn{;̾4sب-Yso4Nzs/`}q17 x>Kιh2GwK;h &,{ŗTct`bAR,j~3 9h3Q#5"Y2L \ }0;=,.92'2 #ƘC h}Д7{X4 t0KM.3zwAjx,*>MY4\b %E -&ǬE ] e T?Gv]ְ maW$U]=Cv%M*ekU{+q*CIUǮ!/]F؛/,#? wۢq]zJ171Ҙ18k)h]pNȳBgZHʅ`' Lq=Ǟ[Ej:lcKy (~F&TY[fQB6Tae 7 'bN9b ׊ٯ9{m'.・{ dW,+et2b]XFW,+et2bkS4cj*)i5,rU #.0`Nl =,pXC'Ř"e[>w _Ґ"6 YivTU է QIH>G/'rn'ɹ}rn'9(""gY.r"gY.reEr\D)e>wzi.^->a[0B HC K) ҲS%΍Sxs2=){Ω,odWMHlYm79vWI>|9L2>Ku=eզֆ&Nj}:͵%3dqm#%Hrl37["u V0EA3XRp2ƳW|6F9&r6ȹQjEw9lVJHCƌV-Cy:Epm;JkKw8ݗhvÇc:(IMsc4S2& ŘZs]A)4Z Q9IpsGȨaK(uV@_SI7$=9j^Zu&g̾XߟVZʄ2R*M>jb w MX5u[mzMr|t8Oxz í[n:gwyӂia\T`U0CL )G*dp`<2-tN wL=gh:6RNq,z̝vX F%{gqo$9A [T)YCkQ&m%(~>o1= ȓN8 u ePJ<{Џ9B*7xT=ӔO8'Ole&#QHYA z0k5f,`ZFL&Z ihuqjf&ɧ6E}9 k{sDp VX^Y+0EωԘSAR0\H☁ 3 t ('H @GQpXG iYFn}X OE#Ոe(Fq3-R[ a")g^(ofH\DDŽ@C8g-HQ+ l: ĤgN X҄Ij@^]fѲFluWmA/E-\lKuzQeV# tgxx2ζT~>< U@yWNMF6I'.#*.G+VL49*`MϦ &1>*Z*O fr1Dl`J}"RP\}lؒ#M &["*UU堹ҙ[]>a+x>I)jC ^Fwv/7κEw"h8%Z8fLE2Ę  ׿-S 0|3nꟙwkRzCg-` B3XF ]`}P4AX&W:>5k iHÀѶ k(<#\XAaS"E*l#fYA[|UE<ҫ]׫]pNVTB+Ff;J^?Ȁ위?.w9(wt:k#)DnͪChY@&!n{=ֶG -x8_sw{8*,g 7Ӕ={婷fM}O(qӑ`?هd@,l:79Rn:?>VQM!؆!<"[0Jq1EDDirgC:һpC}t)ݢ:~etk'Qg*褏M5*w1} ɵ 2j|W\ۏ"X`T9F$㑅H>HԔ1т$Q4`pHnZ: P7(ݸ74#ч^ʡrķlI]aMϥ Z |˲d9IE*`2DO"k/%%UU"! 
`g,,j8 cJ 8wá\AI!1b(AR&u66Й|6sRC{) l0J8M=n3XBSG6 RH6Yg_P HqRC+ٔM9W7U28]{kHH(f;`))zG_z҈:c6BEs򤏈F;Dq=VjrBPQ5P`%Lp jVtWZ Q\,(Jǀw P\C`sRtVeP:&u"@cQUY#$OIhv =!o3W,$tC^ 4I4DOU}(ͅޑ0v!/-S5GŒֳ</tn;^:,T]dA:X?3H5)sP/ ӷu.}w]n ?]An(ؖA\C+x-F6F~vi3 z1U@@Q5AwAKrmeFA/F vpNrDPDy&c; }@%nAz#r\@ Gv JJS9c$4;Fo$ͣ̓;HF}!uQ` (m+2@!~.\" ):`<\JJiv2 `E`A  G܆p, :C:ud1]9 (3&%vDH@i F]zӁ{BEd?6ob| ?U_.]%jkCБR6>=|wxyh/ӞkML(zx zGtypt2{H(!Cӿ_ > 6JtJn0AE0B)6N1aoߍXv*ͨ\[t E zkjTC1x9AnTTht(#DtS]Ggt* A6ts[\S 6Bz*qTO?^LĊD -*C7G@zr]"ZIޒY.dZjB"ƨ҇CR1R9Pq郡A-0g4p7h pNcw>'Lfjv=ڭXhgKLVP ĆTe׾Q#E;U W) ҅(1 >SAo}vY]`)mQK˝z]^?~:Ws:F%ȗG~v7ϟ߅c<8#|HdM.Hu)y5:5R&x3xN*Sy|;hƲD_i܃]DwW|0o~@T"^V?Yh悾$1 l|ko-e̷2Z[|ko-e̷2Z[|ko-e̷2Z[|ko-e̷2Z[|ko-e̷2Z[|ko-e̷k.9o!쟏-٘o!ZV6ֺe3o"B},;䀼Ņ.|'cB%+ƣQ9Nַ:3cyr:n9*2KУi"h olA3ITbque0O=>cs¹A~JG?9N~p Ay{Kb%rav)& ˺n]t/nAQ7x=0BgbN BUFΠٮ٧iyo'7M֋c4$`nwP]n6HvU3j5 ].6s7b!1 jOˎib2>viF64yd$_ȽNY :_=sHs'FLh2  C (@MFSL4s7Eّ=DE<Bą2>tzj4cTR \"A=GB0lC̵םp8|pFďo:nu..)uN˒beŃ3Z)"؎6bMj<_ԋ.Z]Ͷ6=^J |x*xXa׻6{=g o<A/޽y/僾|ЗA_>}/僾|ЗA_>}/僾|ЗA_>}/僾|ЗA_>}/僾|ЗA_>}/僾|ЗAC[|K>>襠Ÿ:]JOA }Э}?_xE9W{] :s؍_U"mx/O=98"J8p7/>2W^~"C>{S8#5A~^b J:KbCW4C,P$9⒇5-sp~Oƣ_Js#}Ͽoʫ-̮>GO_ܸ-}"}@o+ AY qz'1`%쟓q[(j1M4 ޡqDp_[B'Xu_zXd],0>>b^:N[$r1Dz^Z} IMstɆGC>\:n=IJlX# !VK961={m48Kq! ִ2{,aH)ۭ\4[rgi&?w3.;K6?{ȭd/7Y-̇H 2,kHxE[oY--40)&UdS:ܱ׽I>7҉yt7YUa@tC{ke*T] _Ig0ӟG.'QW+~#lx g2ק`N΁3SDN<]bNxlt<%xu= ф<X´tH]ý€YETYIcq1:? 
c:e.34"o*MMwr5~TX#5*M5z,6 \^qxvGgKS9L~iV3-nE__O.f[5 #% a^Rە"jד9݅)[PMKږU]3kY,a!P*#F1䃲;UosնJV睬k)%Z:NldaF*'> F.U}=;Oqg0+cU\ M+^tLJޟ}M߾?D绳?| ?`&pZ#@"(}w͛f5 ͛5hZ6{=vM匫˞[qDc_%>9aui.46/[dSD4 jzGwܬPQ3mS+_6~@ݹpEל92m˫>Niݚ'-98JI,Ȥ1#'*,RAsg@22 g0W'sRTC:usxOutJ(?*ZasmP@7SGu Defr䋝V.var=<^^ΰvZo^8VxӉx]ٌ_#Blƙ8g6fٌ3qf3lƙ8g6fٌ3qf3lƙ8g6fٌ3qf3lƙ8g6fٌ3qf3lƙ8g6fٌ3qf3lƙ8g6fٌ_-1FO#b3Nsu4lI\:qfٌ1l¨u6&{漄-gS;~&gfD @ m؁H@˜6Ԅ|Nj܆< [xxȡ(=wDZ|k{ XmFVCƘmxp: RlTAk7L\ހR"v:߆A |w[ܔ5$6a #9&ku}{%`VxeM3q87A=eD~}T={ZCۃ1} >!̯ü@ZmO[ zF[a7ی~!5Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ댹Θ׃Zf|VP$Jp, q --,@A`)"jr4q^:x8HIEF8є'yݭFa[@BB,Zm ƀ}F@<Pq-:^oɒhrSmqBKikpcq n2*"UZf[ͩF{9[ڠAnE@LV$|0 :A8UP|*뼅rnW[Y+O^{/fQk-Jt.4xY1a{B\`9Mg6)B^,ޙswX`zS$w}گٽIJ2h^V5+va}^OSO^ E[{vL,&?Gp:pn:2v`.pSF=7. W3GiJV$B|ĚsLI`F]% t4IZrIJo_bTpH]BУQWI\%E]CWWIJz3>"ukv<*{<8CWWIJkTW"usvUXHK0=tu$:W$|)&BJ*}, 5%)իTW K[鵓*v34q$^FSF3T˾̓4XF(ς'Qd B(.4HtH%CBs'謒+΁w=Mwi7=W!Фn~◟[yv:]N7Ig%퉈\zo}dRb +Yd%I*3DpzK0J n<-~Pvz64 0ռu{]ٸ}ua0aB31GG^N?zcHBc$C I/:j)JW8 O 1)уwMµG,~?Y-斪Uލ"w@i -1BXrRyoV[-- nA}d|>rghI?BTK7bԭ{PL-G'>P +ldp,H"uXzHK-"b9bjFSKɷӊDl@-/%K=w&S5`9,jC)Aĥn]Yuy%l>f$m/& EcKlkέ4T/dRQ3 ?E9c):!ТHqrB6X_ SAғ9w"! b,@hq~јsa9w(*i)8{8‚7P"9kM`B<UX-zg՘I2b=6M x"eEo5qF(Mume!ք‚I]` owdn0 *\!moAHUՙL6{Oճ]yՃᤪFvHg S]*nd:zƧ q1xǫPUSbZzw0Jev{vhVBJڼ]{e+ݎ7y0 v7;$p&s~Pf+WxFQ4ϋ6a]P,FzOz1m փi䴯zn7!ߜ1 W Ro?7(zxa۽B `{ҾI5V؈= }`Ql5,tQpT9J k9JqTQ*5"va;X+\$JCƶ3N&9hbMCR4?&T;,q/ۼTrY6/i=k|dLL.8&JhI=@IZ~IJ9V_#R"tYm`"`7RE"vǰd&CD{f ,!,s$kڠ5q`hYln}pKz`ɻQT*%0(ѩSMOE"<\`ќRlb FD- JpWN␳iG F UOɚy 2r f|"Oc`f FILSIFi fӊIq47Ш/䁨}aq(:At AI@E0Z1 c):8zgl!V)ТiQfmOEm{v={sKs![Pi ݭ̐Xs5d_Y?\CTrqCkhx$w{eh 1) FU*7k1-T0BT^H;i0 c}((O=Oi?2 *eۙ~MFM xh\h7iG!(ܤt:֫'MdeW? 
{v֠N.v&*Aa =m]iF_;ܵUFU֭j\T;3R\YS5k:WtnmU./HUή۪LZg L:$2h6)TeSeld$Vk| ?]ƈL}LYeBGhe'Wq|| `#Ud&v]O͍- WI&t[*92+nheżZ+b& VVw#|۹wCUVnÐvEjwbSΨ{e7jMjQv$x 0޵4  jjfnM|8>ѱ},%Ev,[)tV7 @ÿ<< YvڬdBz?^+.]ڧܵhqqõ[^=*l~; oNߢF+|0S6~kSp{'g'/'7gx-=#/f.@Ӂ n.O=3_ρt%Wg-#<%Њg ډwh䜼.,ٚZptmijeo|#T#6=/ ճ/lǕi!y0jX+ JUl}AD9|VΆaARjU ,Gӳ풐[<GXu)G'*meʐteR7ywܾRwWо:0p^S9dVI)|TO弶l~5!Ŗo''w-/yamS;7?|~|gCIJZ7DѢMGwcXWv|}Mz|ڠ5z0>tk?r-^㭨l 7o~ro 9ܽא.:U̾ JipehӀ>2nTm|4*KT;R|@RV>jxl\Z:-0b-yӖ¦MQ5rI_gbXׄšbt15'$V$>'-aI( A<_я8O1m^tYc_g\^YFۗ>"fz\=j@bikN L1wS+Ho}`sCth"jpGE騫O4gshɱStOt}ؽ{dlv5PB~`<7&{L;O=:TQy@qkeTl9#>7w^D"E̚kmzXUVtYk#`!P9Kɲ$Pʤ]w齽ݽNOȟْLy륒O;^x;777뼝&=!佔fST ʿb 1Q jj6tFrlj%Q7!x4U0MqJ@ۮuNk]Hsj }*CsɨY"jCc^ >([{;F݆ct C2{rn}G|d^>+|/׎ syߟ*}zNНmy+ׇG%dJ6 Tm=Tk`tp}B^o=41 V"@i0*V<̮Pګ?0?uJJ'r(tw^.a`]IRcKj wwoC͏~YuC3!u%e JK(hE l\|fJs9~ؒ8\_ԣ™C;E\(Bڹ,,ӳշIq]bXdoPRZ+M@؅&ډoڲrAsv F+#&#v CoPֈ ]DEƥ]6q.>B*ypvn; rm<ϽލÏR 7^2_x#淼ܺN䯾jrs}6k^Uy<|u=)**;rqO˗GykͿ p}u9/z.uxOǮ=b*-O̿#r??]^oKڲYrBv?QN'桚b* DW@t/X+I/^4Ia.mJdwax=l݇y5˻6oVҡvob /tq P)4CP-ec*3SW p DV p(\I ,lNK5UVWsQʆKàM"D`CW[]5Qm rN4@Ѧ)Tmkֱr$ɱl|W츽SKH)OxL/[ĨD2\g]”wU'`Z<-fg(ҷ $g-wPRqݜn?MҩcEeAaꬨ݆CӫE)MKxmTϻ^Q}l)i=^"2y%[{UǯԐ|%[~>~,G]Y[+/8+b{CHU GXSNA;oHѪ 0sq}-Ps̪G/cV%}+ɱdb(kob#2vӖ}ml d j #jv}<- C>zot=o' q62;{W%V,9yOueAkn/h2.(NsI3 ؋(lif< P)j T6^[^ybyހcݤc,VOV{`mfR2X͂S}%E NY  IcE/vlJ!&FAxE'[q7q%DeTcg{m8O_958M>E E,c5 XL=~'|RX$xEGݢVl\XMmsS uS$m (>q^ف Sfa\cuv}"u8.>Du\ɑZ$6@G 6hۦmĠ ..&K:bi?{xFRm~]XOFbK"6*ywH $ަ@Xv~\FSSd1|I-jinGٿt-y^#9t\Q):<*Er4^t(CȪ3x!dRRqQ[KpRkyn8׋[X4SyLJԃײINsʃIX *Y(4VgEh&;kKS,;Px$\HNJ|@^e *q0p͘Pe0w'{zl&TIN41j ǣ.֍zG-qP|xLDhS> <( DeO&*)O޺:2K^`/5bMB}rTӐ \2W5R+ 77cLH )Z(yN"* (.3MBK{cN)Ky+ b6H_? 
LQ ^%gάgb.t&mD-ōq@9^^Y&_ߗH -wE2%hoWAG=4<|:JawMQeˮӟW\ʔjf{}͊cFos\ٓN[7 5Kaݧ0Wv5kE8Ն?NHwWnA~hIP%[ ݌;Y,Bi8jt`Ųϋ_:UNVw5V'"-HiXh@NP^9?gg7]RcMR=*UB _~闋ʟ|?rxw=:Pw(jUpk|xZޢiaoiqc=B$(ϋ;\v`W*2n{q ?yF6_ĎvױQ!m΅˶c6?y+w2hoST TJCg@8CSp or' BKgfs06dEM]Kh%ܢi6 R ׋V.v&iz{ӕhēpI!Ox0Vyg0ӝN-vUa:N .gV'DsnN+թ@i: $J)4RhNH]rvvꪈquUγ)׶ت^Cb'P`ɨ"'.RZU+TW +iw]]q9*JuuU4WBRW(08E\%OE]imJ)׶ت^ҖZ}JG2tXNF]q9uU?j/R^2FM@o W7i(V%n-VbaHV}67-p)rz¹a`2!IX 1/ >ggܛ,kWJe2f]>e`̛ZwoT:F3hPc*k-ŭ;PTDϳAeW(Wn{0GM\4݅-. d3vĉ64_Tj0-b!oY&Wj4SOqTPjCrUʡV9*ZPjCrU\ʡV9*ZPjCrUʡVvPjCrUX9*ZPjU/VʡV9*ZPjCTQʡV9*ZPjCrUʡV!+ZPӿrUʡV9*ZPjCrUʡV9*ZPjCrUʡV9*ZP1LGa )P q%?`")-b rCb]qGⳙ_)-ot+6S"Uĺ ֺ. :%`da{"GnHo=%(ʧRu%Ǽs6% 4pWjB6*ZFywzpۜeݗUZAnR| FdWi؛bbDS/zAn ~_X}1icPqG7,Jv_29[NYz[_pF871u-rSaO߂z ; E+YՒG~J cEȞlֳjDY?A;Az!:7h{@LCASt4X0*.S4nTeВsan7[ɿ.~6fr޿3!nCyՂWt"AQOЌz[Sj8}afgjr*)N1(s%;jIZѧI*`-$KɸҡR Ef`*CFd40Ϭ1&d.LZi*rۆQkP|>ʹGmΟ,^&%"7B.73C.6U,MtRJs8ye?MX`I2Woxռ*pj^+>G=- \9?" WW>P$0\tQ( U xf,(ʺV}#FS.\}>];5L>Aí!ghRFC'U߽F1F1/O$0^`M@#A c\GҘr)Ѝ<Fihq2_J;͸a.n7}8eanM3>DnW2kvKZ^l˲}8?P+cPJ]̜S5yfFel`J wV%n$dep19A *Z^ jg}븸yWɮ93Wr_Q=xRfu2'3L͒;A"F m kNi dʼnRҢ"N\=v{TU75=JAlf8E.LI: aV o2tzyPC %S=,מ; )}/m~=P0?qZ07jbJ'?J`̑'VMwWխ{{[k-xa9$n@ۯA=ưVDMƯ7FZSy{}x/=LNhd9qnyS^JtͩIFPe V\o&ٔkyS;z(pg}kѓ!Ѽʹ;ƎOv#fٍX$A}JE_%e4 Js =U)s4C u(i1C_WuWWEJ"fr.Yi$d1;.,QZlq@ci3cwER!dAC*Rh0Q6LٙB &~| Kno?m|.VDWC_LvgTWo2x\$흒HÃ{%w\N. bG$YXQF#pm<ۻг=K/Y%9Yvh95N84dNd۬l֏Nψ&j~iZ옳Cd }OT/hh*|_&"8M~12)TN!PX0"9FAzªD) RA;ND]La?)4X?}sJgavcC65y#[4+2 OYͻxMZW}oC#`IAǻg j@3"Ɯ1%_UNm.י>C>nqsCZ/?~{[/.Y\cȜ? Xl--My?ϱ˽v5N]y~a|{G{6·pfns/>_EHO.yCO:3L?! 
qSXV\> ߓ'|&댑F_{W<vKe˩'Cb]Cwiֿf,W/<$iGetHoXlKY )Q/M1,hXpE E iWt`o/:h#'__ިA&E4I{$+җb3#H:*,dKР7hr- uﻭ=b[}쫻HoVElz9]8zVUባDYOjINA)4_-'Ep&;rӹ ^jBo5/I 1Y1_!>P"7Zj}}(U1AvrL9mchgh0U11(t˕7"GX2dkwr:g.G)]W7ߪt3vGQ De,>OxQ]8ҲQVh=1jlrm2G 6&rq >>Z'h̻t鲎1Ӆae3 QqRЙ"mJQ 8oշGxJyo.mKWD T9d\]P%l|(0ej0q=(Lwp;f`3wt mҥvY+c|N){u8)%,h:Bwp!{ʳrД#8@ 61(`X\-THx &gyW7i+X<3P~+̥q9XǢɠ[&䒇X\ՉbxCNoeAO= x7&I!sZؽw LЩX@©eAZZIOz>.zFkMVpM(AxeTdY4;Ǩ=߷Jᄱ, 4O贁Nx @uf00dK`dm툠Q?vry-YVRB+ÊvA STѮJ_לQ9nsmz 7=..n>ev 6Bnᦸ z5oVoyQVlZɉMG%U"*w7[y O}|<-?1y8[=f@XW-G ~[օdJ2[~G@-]ĬCa6E}.FA?C2p`>uf˺2F`VpUQP젚(WJQKPRauCIn ʽZusg%5@>} k4gAsCJF"ηZG%:1lUa,aj*$/2IDK\҈ :\1P\g%fcZmͣܺS"y8|_/޹*ew,b8oc8fEZz&q96\IǿiA/Cxhwk L t?mfryoi{RcNN_.fW{oW݃Ym\"zEz/p`U`]PhsuAD &~Á \mP>MKh!-g.gՂ?{}摛ͧ}C~6i9$B]wuCun8qNڿU`W@f_ؼ󠐵5Sr]{Fz |#0TvXyOV LӘz SUE=$9U%VTU FTub +!Y Hact WSĕN VPa\\ :7v\J++-s"\u\ZtƎ+R -v2h%+ 6V+kWv7H JxM"+\cﴽWR7jr k] ɵ >zTjpVp%~]/Nړ[W:UO洁 =՞hJe]I?puHs֊pݩY+w{-x9He WSĕ jɛ|d9։%u%rƼRSwݧDј_6Fr -K2y>{d!tEdb:gt-Ջ:Yní_LǞAquRЁws>0/Y]!z^ +_5 6VѐSkݣ![ͣG# S U+[~1&ǎ+Rp5\j:BJԳ_LrAՂ+RkqE*"H+ikrQ0W8$ZpEj菷Pغp5\i-$k] Hn={Wzup"&+ʅtEBNHr\Z=z\J۬)eM{W(8v٨Ҋ=E\QOQz++\Z=@Tt ~;{v=?cX8y~;uO8q C?p"몟ʑxա]-WVU+P"Jւ+RkF+R&+'GGMME]Vb{ _jŋn_A,N0f˺2FY$,.aP'UZ;SX͘nOiq6p5jpErW֍WfOWtip\\k5r"[;j WB[pET[H4j9i0\\^ Hcp5A\Y& W(z"kH.ZpEj;Hi"ӊ״Վ5jpr &ԪoJpvp%zq[%ذS]{=պ]Riv2(zJ4\+=@^-W$WZpEj;H WSĕZ vjʂ;]Csj lѴz vd:zyk呗e)*hP[GCrţ!֍ݣAfl7IG*HAQP0pS Hj;v\ʱ%=lz\0يpEM=uXWT_S5:`弚!R+FwE*9SĕAW+,D=ɅjpEjOUlJ+׬i KU=u^jTJpfp%zy0~^9;u~r_b7pO S9sdա]ϕ#(KvZ[ Pу^+R)Mq$:hM獋U k;]c)ದ jOk񯟤?Sj̈́W$ԃ+[j~r\J&+0KUPh3j+R{8TqիJp&JY=U\Z;zgU &+mr"\׸6HW U+z0V}ybq36~ 8X?:Sujo?/9\N8kdVYkE-E9K*ebYkT5(x=^,ɕ՜ZƎ+Rp5\94)VuErW֎ߺ"m z8cNXu߾'NZsM~*f]A\Aա]ϝJ#F]Wc6)gaƨ,_a^Z'SOJi[ *kY}"Lƣ!VѠZ=R9+0ͣyWEEBVjpErWֱ U: W"W(IS Hc=ULy1p5A\)末xKql=[(买Ub"6\MWkfjP] P;ucd/ۻ WBM v'`Oƺ"j{WR7jR kQH.Ts2Hj;H3 Wĕ&I?V+X-B~TʆgZxF_!xv\:z=p};ڃlØ%X*i*i)WZ@#+PxG#m.WI[DĆU/zH(վԕ{w>m\m%=`PlWǻU֏ `Z Bq *U&NgJ]wtL ?d ;4'[؍&Bn;glcl@z萌mjè9fs`"C+kp&U ˱7$V\YJ76Sw>}XԀ&}>\|%S֒o%8pA&P9!RJ|WF9U3{.\>npJR \u6@SK_a|0s/J 
L#]_|Oon]JΟCC]u/-Ѵ4Ϻ%%6CJ/]e/yU,mw~E\,o7(>?xǔ!_q~z(N<7Ӏ1T)b+t@Hm'g)Oc.*5X@Uv>2~3j,)n]b) } SY=A iuyt9.ar7\6,\(>AD管) 治OC!W8t9AchuQq=Z,{p#`l1NƵV֞ ru.ft/qy̯ xL.U/=ܘl*/)EHޛ\HL@epв u22L.fo<%fsQxo6jJAxn?2Úd'fMpuE9TħX Mom}F')3{PODçZ2H^q9}QM墴0ě̅tɝN/?O/eSxo5&L+hY 7Me=]険X|?@&}J,GZ K r%- nEHɎ,-#K' sB0^ɣ>NOjE#z-te75@xd"7Q561JGC@t; #8d]FDz'C0!B*Caj3jX$"﵌FMFSOHKJzguWppRjܲ]·FU瓛!.gMn4-PMs-v|K)Y-z=274\IY?l 3Qɦ}Dt o.O+Pj+kU+ 9Lv$"^tY=P~]LoCsH]Jש׋Y-*q/J !eXbɷ n y!z{yevlǍ{6lKTwƶm)x4g?K^m$\,Yo=F2|n7MԄ֤a"zϳSUyfA7mBuk2^7Ϸja7+#Rоubsi?6eCfX{(Ι,בa͌Bcl)=' ;Ȗbq(펺fg)V,pDs|6u?$; HNGB9n5؆^{0rHRIpxaR[71Rcib pg]=aμq$C1"1 R\eM yRG ˆ {˝a"a>ﲦEb5y/ !5 Y4,e7,fTg?~b2BH_c٧|NG$}] 䛓{Nme'$MNNOkKp^B f17)ɏ`b0Fٯ^1|\%ؤݘ (5G%1rl\x(򫷤*[HzpUo?8!9K8XͩIV6#/3fLɌ| EeVDsURf@ @/65#gWL" ;;aV~foSbr<"gWPZ_ OYE̒-(h.^fvDOk,I_ -N Ex"g2\M`.za5.`,'7h"SsTP)o*4\po&&nKVy^ξ50t#/!.蹧eg+=)|^(͠& .+wǟ rparZYM2춮kSt_$X,KIܭQ+N4 hH~5Snr#q譺3vCݻ7ۛS-bMP5H-` T F*%#1-0WQ:p˄֌=vyĘ F$\.%6!`BLB1jve@ּ8zIՍĨ`F)C(Es$mHIW1$ƃY=0IixGK,(x RL0 }P(bhkcDO(P2Gˆ>J#BXYL^jʈhA2F(pHqZ2slM 'ָ|<)q|Bg5dp(𵫗}HӒ:/~;v`;HQ"X/t(pG@< ЂR9jwV؈= }`Ql58:rpšS<;Jq4Q\93-+" cZ)"V/}e=[Sq ACb15cevxidYG?_~YCGG1D1qk}Zy]UG 'QdsBSGD)}G6_#لFJZ'4{<3a;2h\+Ty ^Y;@\a~7 EHg"\ h8csI"BѪEMG0FvV%(Α]T.Tbs?Nܳ`cyw(f?ә>v37/&wW292&ZU}SOT΄ZuN+9R{^M8-e5!HC$|?f&}~ą7'6ݜ _dt)3 zt6=6?wN^dyU\K{ގM'>< "z۫ueno(Wqkq*;/z=fS(X`:#Frs.:!BBqȟJie>:{a=p/m7tBؾ,φNZ$LiAѮ R2H;F5@2tA|m.=ĤIgښګ5Ā`[B]ڦ mBWd KDGZS^:Սh B@k_yZ>j&n@W} 1(-9/W6V^1Bef׾Z˚ߦMmy[.Q C^~@uR\ WLH`f af{r+Y`gHlGh 1@0!+!!MjiD Zm\:PWBcJ+8]ݩ"xΚv)Y$ &l _F\w3,̘ uv Rl $}P;(RwBb!0qI(n vs,2?kU֊Q 4, RŚRDm`5pJ=p'7ݠOe\P%H(TlqhRgeQ%s51rwT=}[:Rh~usWhօ1?v0l5[ ys :QwdlQաhIFL S" AĮ/R d$ *0Bjkѩb#PF=Jk׶1} fIFsm] h.\3qv^xK[o|Qj{ HdfXpgGY_JG gC1^u^Zy@Iw\qM`O:%Be8;’!yEoNWTf|bJ'-xw\^|6.o(Nnc]vgozvgbԀABk\9^;MV%R6Spݓqu>q ::r9]4U@;P0~Ba,fQpE%2yYY҈-o[GWo£y~X6.1h[GGl r?/=SbFa>\u;??Jpi?܎柿ߞ}WOg{݀T):*tQu>YJ)H)|f~G~~⳾q3@(1\ "!.:xE?N891 t.9\rI|=a(AuiL=@/P%]')kiH m`_<#ߌ1u3Wڱx?]s}9giQR'RThFbESJ 1hp@#zOz'^IW4EP^;$Ag4BXIE-)еnqV;ݶ>K*eE9ᢓ[r9 "u2DUfe{"K?j~ƵQEYf\J f\*A(Q?FY6lKX iaz%P1M1ɬwҎSV@> 
[%1l|u?G&pZApCD?LǥiT36(AȲĆh+/vSW-给|G)Oǎ&7X ?߹:)A S|pE&E:t,B 2CV0҉_`%Nke Jq1ࢮ3RoyzYjpRkL=5Wz~Bn&O/7G#{|՗KȜNBXVF)xh̴`%"$|5]V8e4,:e!bIIY)-[4Pj̓w; kB|'<5}Yv<#cGR6CLQe}`bbs*d2: F(> *?{A_jjǐrIB'"aE*^kb+(B.0&).K W|CL<_}bz{V^ 뿟}}O~I޳E

XS0.n73>fB }6kt&:oSplJ9%E\C:7o~+9%Pu-Νf71d.` .qEy !6?Pgo^Q|P@f? 'Y\<_\Y W;>l\A(MR 9znw^8bH3!-j W篏#OZF#oσpanZU;)o-`.q|{KZǰ짶rVg|Z{hzZ$]2t \+fVsfH |pv:aOQFDs,={+C+K+Gh{Mg!Z?R_%ԵCN;/{8kunY1{NN =$ڐ6Kqԭ9gc3{W|7ky@[&y%ҋU?|o6xo{c.@mgEh (kx,JpD7`tn6 el|O{K.ц@F(ϭIyu] ە$^<7NGſssxtkYfca,U*KwnrttQ?,K 2^L;``Y➭f^#7Oj|:a쨪 /GHEdkՎXoᖑ~e(ˇ.Wؙ>k9ɶ2mAp9҆3%ilMk,]Dn2k'݅$]I°ing结I dGH4(`ٚbYۍ&9Փ$FBv/rLVꃦIHG/ɫT[T*V{Rw;鑬cFgdɘu!gk?I/9o7GMԫZSI{k%h(%%Ɋ-Atz9)!K#LV̇%J]G$6)hK }e_'Z)|@zijIG)BXX'餞.]FT,4YD ɞ5)sSȘgE¥:jt&+(IPkCuYv ӌT"yˌwh|z`rJcϨ-OXݫ N6` B? uԁn#Xe~0@&ܪCZ]%W${pyֲt:Ʈ&6V:.2EG `w+GQ6f8*q^$v y [ZpUTiRS`Q1 !]Mh{48pqJcXXd"zbJe5̀j@o]\Q?A2n@Ơ) uڀb(4VH(,b4v#]7cEfJW5gAGEEqv0 }G.hQ14KT@gSO` bXr@Vʤc@8"+4poSCM;%E9x1o&_4VkD5{RPRe+3*F-(U&伖( BDjJc`V."2@b0۽@VQ=j+`"КeM2y vnm3RTiĬ⤁1&bbtNR!'DhUY;73sStፍ@Ӗ8Ҩd :JrJ%V 0 CMZe&5:#.3(|" H&jZ5>xmBV5 U8nxFi /!!d<ܗUd 2inu!ǣ@܎m}KW V'ES U_~^j~gE3# ld-Ő@TD*_nmkU{.N&!z*hIkVxIYPG]Jr P _@1`rYwk)f*H"flq%dl%VmZb~ M+4B:gF& hH `f%݂ ZMުh{9ExGoR Q$4yhM-J M ڪ$qc=v#gv0/j|MIYuL FP.tbCJ''40z63 ]5໓(٢m[SQ4ךBr$JVO ޮ `L Lya#ҷfĞT2:%<%*`rh[S|'Ѝڪ*-HדvJTEw AP`IhTU")RC\a뛶] jV"$RKLFA%A ap`R p#X("#:z!Ni,+~(] ڈJ5xQ380)`cM[,8RH 5i*M3|iΤyha3BBv ZgӎA^IhDoZajlJ؝5HL)7VP8iU8Z6 '\ɤh^H 0##QrR5]A=u@e(6ItW׈J0 Ǎ6&TS.*w3 H&!C1 4&ʥ"0xS8Pk|X~B:C+zgjp!Gz-~۵R#" *q`ZqiaϙdOc+\4WB6ڈnf_3yҊ#X|뀲7Kusn~\eT:FКR”k-Yw\?,=KLpT62dYaŕp&uI0'wgDwj;<*+ cd1!ƫ%>9>j .Fޟz g@t>_?a_y_2 <7uxF[C 'PSx:x@}wBgT:hr6/yiJJ^!ڴ.zBET/h-U}j.L,эү ,6x8} rb/>~\}97T) (6ïF'WZokgo/wp ;/C~E B氾Zf〷 F؟ff^u[ݶV}mb^vHO&Zֺj4m6V86 ʔ@2M1WQ "tJTs.X[k4Oq5Fw۵,YV)LC<>UO6-hҿ`z -FAl&Wz0M̭?"؋yֽ_Kn|oUa5%^/_dƵ:W%/s;F|ݜG>^]a'W.D0cۚxEI0u<棟Nf¿E-7Yie& [竵ªQ~͡*lm_q8 xsȇiOz6+\EKtwUpj]ȉ>WaUޙxJ !;T"GT_`,8ﴴm?o3"_-+8||;2x`mF$GH ,۱x)q:or=X:Kt*m,(G b#V\rxA01Hz #Yhċ%267J6  @4}&_.d &V5Z`6 /u "HXeTK%ͯZAY$?i R0 9&$r9BYliQ1Gsy/&` m_E'x/z7{}#v7n&~мz P<`hhѥÝ@Zү-aPwG^tIͭ.Np5yٌs\!Ƿ:y+ǎQ=uè`ThghS2BjSh4("2o݃SI5V  btهZxg2+\_ %¥}DXduO4%[7Z&c$Ԣy( ~Y}.+s G,2mH`(K&. 
#k 4qy&egm""(lNt}>WHE5>S.I&IVERHL' *f (EلqWzzN_}3_ߜ3G[\&0[3Li|[ύ^u]!ߖ^A坑ingNn&y1pD B+fKi&ك`Ej $i̝a9MNk$gJTGH@ q1q u-6Zou0e\mE{hÖ:M]ԯtOl9 m뜥a񖎻wEn*i6r|/H`(o ը+UYEya;ZQPH!&wOLRd*'SM0>Rz;b֘Zt2 P^͕ +&~4 cGl΁W5jxevLU}u4n$_ ճV%yOIC>&*ߘ>`AgaFe:3>0ރi47^^}>, Hέoc?[n;~_:e,lGkB98@P.<&z/U^G9;Jҳ@6Z*Yn/xTM! )!<7gFS,38"6f+1C7}Kv {KDy%,wVzF90ve%8_y= 9Dr/%4`B`JrMȥ^Q&ŃQ2˹T_?=6hģ#2@ȜZ2"VpET1"9H} _C&{.2½geɕJx%f6܎ J V8= cb.U;GD\ 8$ J7g+&~ؐe]vvDZQGܥjoc"9`KNYя6.Ur{$@zI5@n RpjҔG-X~Gj&dLh^AN)%~p=źZ9:iɱ~_R:4$1Zi P\ @ xnR@?jj I_<_M Г10|Q6zpc0u!&b#ϏSZlٻ6r$W<ٶEpd7v$|)Ȓg߯RK,Ɋݶ8)f"D|,Vpi/!5h daPNh4z(VTgз^Qԡ wH6yݼ+H/qE0S>GoC!Sw #W+(Ⱦl~Wk/K%C=A5<*'"9 ;&??$*6V5g+4+]Lg7#Z~yS|r:j]h;s.{Nǖ.9p ~=|YN`>}lh%!׷tn76P"{ď!- Ŋ+lT}ZYo0]?~7>o~2s_3:80n"]N`__{ӢinM+S|ux1jòk\=9Xz~ ҫBEyQY^S6a7ElFq;y *6am v%bB bM~9{8/6V*6rGqJIho!fH /\1C#Rsh h3fZ-Sg(=ǡ.e->סA)"L$ ̢iN8Q\Yr2.J`h_vs6'n]]ߞ6>CMMkpUq0iI%|qN8ĽȠ28N`R^FZaqJhH9pVTL.!Bw,n-k7Iʭd\HofNK)Zs}y?z1"|4X qZ0E@L189ZFV| A^2ۮV_S(#0r bW59,A ~wu`}ϖdݷO>w|~-{!/B{%ŶCZZҺR^-/|--HRpۥ!ʔ/=gT0؁!Goktiya53 zV^Y)ne^HϥWQ'thTj c4fDZsVӿ0%s T uufbr&-)_xlX7ݎ /5+;pZ᳇>WOȇ;%Lp9LVZ/ѓLK{J QiT"$DFtۊSI8=M@\Hb *{żYo;qJ<ɳпJ(_ZsPJMRpK(1IL.q*ps?ȏ~ds? 
=?qw'ްv ^mP]kXJA)* KBd8XI5GY~dV8 n"YQ9MJ%3d $BPD')uXĒ nk5/ \yhPRa<nhw;e"o8:8R,;w婷"O񛢻Vnjr<@NrlG͹1R炠UxzKtdF&-KI2P+!hL@jTe^P/-,h0IQLIR&,Z-^j;+ H۪&NPyi˵B$a!NOP2g-smn;-%Q) ,9ǘsf 69f.B"Ui`P"wK"[~*flϊlyW%\2A^2_Nqu(Q9PUEfQs?h8/qQRϯ㟋8,>j~*j<`g劃kp hΉ^5* WѴM*=pe-]_->b+s7xV=2#ݡd,8kj"~#[Es8AD^@K`OţN9E'Tv#/O尮M|_5 .l!h";3~Q~TQLyETà\05$b3.*Ǣ *jjErYqL]GE'КK&TTLkg@YT!ITw]qI8hZ]1;u*~5S+/oE!~͆~|CQNЬSQOY{v\fڔeZSg_c`kn!GX `/|r:bk2'د 7orC{tCۻ 9CS 6F'04I@h5(SriP"!<9}Uzhc3Jggk\)rhP-urF4P<)-w`ZW$*qh@Mr!e\~a(% $#[CrB琉 S@^KK#gLᢓ*\'71pԭdp+)O$2f;#ÔK5C_;2HA[s*.4H1OhGnJ8cH/DV$Xh*iK yC< ݪ/#3h4"N}`DuOJJQH<3<D[tm̵i &P)̉LiIaIk% ea)昷#Mx"14X!HPG6YnQʛDi9M4(AGL[η HOaQh"SeጡvTr ُKɍ!*Ԫ1:& >14|0RA8G i[ B-`d0ҹȍa5Ok&tN*7ʜG͠fU!lr9x W.opX3;UCZ/g nqgءGBGQ~̅S\f  iTmPmDBJ%0JA@6RjT0xG~P?l= E/o:A&qN(yxA`kQoxמE2l8xwzP|0j eTTz6hgkM|_,QX>_NSnл^c4bv'Ae^̨ŨtWSY9G!Xv5`!;,d:<ϻbCTnw~ dRBK*//?rxYC} َT߂;ÑbiJ+TJZD@K;sg;C;C;zA K ,jFM#$8dP+^{ͥs6IRd*vBFcBbkahZwn6PCW4>Bll Ue^2Ji=]86h2M^ nw$P"_24zV؛rY&Œ@L]zg 6qvNYlǿ|at!$K fvh.Y ha+HGNmT@4OaV?94tr=Lz4:nYٻ6dWlؑ~1v.@]ؗB_%n(R!);A)EQMG-jf]SU]_uuUra<֑UʤB9ae3 8Ni.K6:@O^8z7%/W 9[ ,- -N=%vi']~MB|%84Sсi21#4.L-[(Bj56W9Hg 5%A&hI nY܆ >u?{Ϗ >wiyMČ&2D- úNYw`D}= ә-dgT6Td@It<.5L,9cP&Ř:hmM"o"o m?B q'wZ&yd]JF[M? dDutBB JJE5/{_=|( ^W=sk@spj{N(BHj3_ZK]s٥Iuhۖ0t`mzCSmڹv(ȶx$|אFgMr~WVw9k}v׷{N %`cdtCGBoq͖~NKN:K˓tfK)m hd퇟$Lo 7N~zsbyɯ׀O՛.oG_?3پds88y9k Ѐ0G'{u̒|ѹ o'i=!C7 3Ǐ %\G)PȝLY/6[$P"/v\2$^+`b]EBp \3ZV WBD*ι(Uƪ.m.Bev!,=8FFkJ6=2Gq:QW{zhjDכ4//KXGk Hc%;c|."ԾNg8ɵ{^{ ʴf{Xt xB45ޘ=`UyOS4}IsZM=ڕMwyY! 
WV^ƫՇ?|sI zBe*W3W)DM?mQӒ>uMKd5-5bL6piZyesZPz=^E =oN,/s6 ϕ"@9HbSoAvs֙H^ c$f(S,D!RE𒋈KA2;޾h\ ˏ<]W KbŮ mo>Qyh=p^`X^_^@5QK'U3W_yYEU ^<T3 V}:TQMLZ90³,u ń$k Ki@"V-%-PP!m4pO 0|J錠≛LZN =Wk͞U9ئZ(uҟ}Ea^>_avݳm~jTgg}X^5s &eJ, 1 #n(*z%x&y|t%JvABa0&AuUUr!`T1WXe3F.[Җ>rm(}>$$GL$qğ\'&q͕#em䘍Ԙ`3Ee<2:pBFr)+x*.F.6j|Lb?'zf-g5FbN]BF)b=5矏Oү03#WKګlG{=}rqb=^nrHș5a@eGWV1 8^WzD+= FK6^0ˍJ)$82E k NMYEW](X$c(18-(K$F)RYL^|>&vF>$qϋhI"2qb WIFbL`sr5QD[F{ $͟"8hQH4O}b < )"'*.<9͇% >"NC{.2½geɕJx%f6؎ J KV@)r~ЏKXv18ZZ[U@%MSU,rƃt[n|`сt !U .x#@!0$ GQxd$A5h bܯkP?;hbF,V#q 1zi.ET&:e~>pCAG)Tc(!J 6)MyԂg|P`B@$̈́50D頚R;U{ԋYcu}-M[գt&iIJc7V(CC xnRޠ͛Z/3aRY;ET=E'R5^oKtvkE?E?r)(8,F0<[B =&xvJVEZ Q*:[KZÅsH0$)R8s]uj=3x>t03གh"uFDRD2B:ǐK.jDLqN$5OTֆIȅBAxgKPRqQ[rpRi9;Fkh"3udK|2%PҊ3? 4l^no6w@aDw1sײy;1(< j E-"-a8DI/:^Bf ,xW10iz mA'mp"!MFƀB %q뎂cPSW|\ +<cQK*r β(*\ " o]s۟jiEbE<8IHs@K 1Es*&byCD)T,\&HaAePI%K@eT)h%ޤy޴7D)6;X;X Y:?7~~wqa8ރ?o8q0#^qg. MriPѵ u~uԾk~Cby50_\YR` ;rNFCy8u\hJzޛ6Y.*u($R l&7Gɍ8!p^o癋?@UESw;v]H3r״Zx 0T1T#Qg4v9k)QVM jC7V\”;+c>ݙ<݌gΏW7]bF:'^r2.vQf\z}A 'NtcOBn骩 ,3;Ap,|&z{o2 zm+jhK潎d$w,} ʺ7#_=gvM5=V7*ҨBϤnrחȎwo_mݫ޾;̜_71\ڰEI`׿޵(57C׊t Vuq5#s͟Mwrlϳޅɛ?kN}FqQky &fð/a6_Wr`pkbІh ׅw/2؊V%w}/rph6a"+fh$Rjp:1 hN|jS{fڥ1yf$G~?P!)"I>Pdri),Lt:W)((k7vl`|s},\xbh.QJΑ3w6Um aZabFӰYӍBL+!@Y`U|@ci &%ٞ|Q<(-NFA Qxk(\#F@+^(6mJ@#DU)(7BꤵI)FP<7I*Bzch9B< 1Em7ĥ3(Ѡ8}2#(O #pP;*mEIueCTUctx>BX0q跡gAhק%XJMĵL3STR9s)9XX,_/ݪWup|,b rl/ =:>cN9 ([c}~n6TR^S vD3B+FVR O\m5>Ur%wn\G@P@".2W 12׀rôʹ 5Ok&t.z#J%XX 6*b"A#`%X1#kuR(aSwW}KRKN bWwU{b &ʶ OVw]C96ʡaCf9v{)!< ,knTuL֑F GG0ŰRoWp.`9ϦaFy$"kY@ۀ 3]ն; +ВusZc Gm5v̮!u"Cշ4dOC_ t8 |]Q6vI=^+9βw$r]w~t6_6l}~P3+_ [ϚobҖUvRKj9Uv&T#N#׿Z0-x3-tn9B0E3RPyt%E 4DY؀EeQ:2̅4%qPTP8hfuE;.y@lS0O ZLdfwC]trk׊; e D>1N:Q=S1t~\L6UhCru^NsT :ok`fIZвAߟEyx7$% ܃Ra:o\c>k~2g7س&7犃#oܯzK8|i6ols@hG_Og_N :|sҏk ߷psf"u4wX3Ն3dkdkݫ/l-G9Ap M2 Nt6RoQjPNn|=CJb #5ڂxs}T q`:-RLT"Opvkis#Bq , SYV 0bc-|ey|5i PIHJ!]vtu !-Gs$(33O/HAKP~;_g0IQ8*w.[x ovoG'GGчp\րLÁ[NsJ *k>IMkuO&T}m%[|_4%YAJǜp8,q\ PVZX6MD)AF+넢"a豔D ~ ^›5೷կJ||]V 
VȊӋYjrϖ*o!U҄zc5[%${?2[3[KoiEi?646ג/i4Z0R)Ihҙd[&tgNN0Awi^e҇)"|TR 18b’q,|ӃeQHEk Up=őpp*PX;JE "DM7{,ռ俧C 8LcVz#bZ5J'"IkNcS'DwHXd[ 79V[iylr 8 leV\dXD.qGkTK h$-֝ݖ%H>Dqq1\[<&-Rϒw@'<"0P8ט<" H9;chlMTf?R5J:(s^xñWtc:c[?Ajʈ${&`K.$aR/,DDeꀈhAR@Q4`S\ uuksvj'R)}}ن?!nPz̚vwc'gk06(%_M@SU[cK[h)qAvp=uBjf\Df9p>Q$£He N0<`@R(%2:rPGJ8˰ Llxp`$Sl$,Y"Ǟt RR9S;Ŝ(aއk;{5Zo; ^=͉^`ɻ=pDn=Rv@̛{ElrxFy:П>\`ќRlb FD- JpWN0/:ӎOg=EO'y6^k05y.x0k7L"mc3M%Q)0$O+' Ӻ >"G%F07 d4 [X <"j#;c׷z@L_2Mbka8$chWd 6Kw%{ !nt\QvQGVQ5#/Sq* 7q uFM:1~w1~Ȥ fkʼ0jЛEz }mxQz#ף0=1 Oa oX_T|< &ka&ghxzE/\WDuyыOe1+ͩka˄o Φ!qC1NI$ /ey_A+o+6]v0;n|X2ИxIV#^()֌߻ĽЦQ4ܩa5O#[iJR2&7d nxCr2+>)'u:Hy@ 9&, XUwU_DuNP DQ(n_ D]R!R}1:gYV#&6,"dY[5Rq!RAM&y(%Q'n+n!(9ѿ<,xGl[Œc`L3eS|\B$MR+]'e_0E.3)bCC>/~餙B ?Z~U&m^]g_>Y ?BFmUyDr͵Iu$ZE_؍gn6^TotbPܴKbDx F~FVFh1?_d̊7kV-ϳW٧xP|}Lt)bv^doo4y'ôԛ[/Ms>~:YF60'7K ZK}mo݊ $07t|1Y@p~l0V:[ Ҿ^۫-qm>WȨpǣp TcMͲ@fsaQRC3* /Qe,:ug50|`УaZT,A&&6/lP֠%RJUvT?`vKꠀQeͲJ4؂>lfՔJQQq < r/iVUڮj.Jg4dr*!Ca^Z(ڜ-rm|̓aw17ݛmރ#o78Ŝ9s˭#:F !)P~v)[r.`9Ϧa|%JuqҦ6_p쬅Hr-9˝R 3A0:ֹqvХCכו\g}U\EMr1:g(4 * aou@UЫ>`Ôi4 j]91rj#"s= ^p̣ | I'`4ۯY<{9lӝI!w#NeJ G42V:c@pE( nЂR[Nqh.u#&PfTetBF`p' ՝ x:⦊k_,u8[7|~gB^گm˳ L >#fyC.Rp#U0K`%Yg|}޷Ԇ݄{9ygS;چ 䩄g:FJB\8&2uTRKSh#5$D1$'1&-g!)f6[n1*w|Z/6X*oTZng@l/JJN 96BԖGG46Zbɭe2`c"&# ֦8[z9}]\J\2 XpJS2`jᭃ;d|fP4 Վ <`!KD"# FT8G"m"i|Zw䳒 ߏ͚- 5HakA:'L X@Kpdž+k0akoo:ABc6YkYCN?ZƬ)Sm1bZltzf^Z#Y/0R_o&' dqMpb}SBJ_](%Qvlq=3U_UW}ł+(3U jUZl'=c/ԗhg=s ox&y+38K0 k^feYIù#kةj2U] +֕(|?dNˢ0LDN x THВ-"*ͦm""4v^|wl|i7imβ\ J0,LVVK¯L1gc[\xSzc]^vTmU-Aea~BtJןv`JJ$J*!%P=W%`w: D v>m=rE$2, ηB'*dC  G$t"cRJALVHMn8d p!P ƴ_}!lbݺ8=zZ%_2ْf+. hN!Hm9W9SR{ѨzjuZ)mDu_2CԂUUIls|]UZU]ChvƍV,ԛM>i>x|v4: RގoPGVW\Qwl#n^6=U2PQb*&h]L'(Kt:g^j6Ǹ. 
|h]j- |bB'[zFp2:ʲY$c{oCm}??Am%a;55;t[̭܇5JOhO{\LƑk2n=|oP̰gc[xZl$ۤ P-a>@iieeNVs&fcicìQiϙMbcN%P R>mb:( !TNl[ WmK[#dxS֪ ˕v>Z :G2(1 c>0,I(ScZ:\S* SdgV?\_q]\DC9]/t9.kk<=nR"hToKte=&-0N3ɻ2.3a]@fcJ_{4cnYq7Ҝ{fn~} ݛǶޞj :ɏ89^,P(PfJB BAB2!tC"zz;^NWWNZ,LZ1ME1Yѻ5\D',Ⱥ(5YXfd[R)@.2KəeNN]jt+j0NKm_!TD"SzX$xR(7/!ͅ>LJRlEI$CUEE` t4.ʶx_B/"Đ'x0++jlPq8+7#Xmg;UQMNoԬX_|rvNCo-N2yO 򋻯T|%YI_-U&Zn%?+$Od{I})pce,A)X4, /!`,A0SXDl &͵+TmXm:[zX-%B]f gT つ~ <qBcb",NN;TF%<%)`E}394U-,T `2!E1R!hHFd2v̺,!8#f<:ta:IW\]դcWnV!; 0saDBF僌YRLڐ :aM6j&d` r /0Ғ!@AS H6>!eX$Pm:ީ_qø+|lkMehEl.g^@(TM^; {@XT:{=3ґI53 o $wY. )D Qs#H#=k,jjM~Un\&%EW.f]3MFHyY)Rr1K4hiA\6h˦.֒Sŝw֒][}= +W"C1A9IFnD8qF9nײ_&3NzJ5[J]Y;I tY D-ejJGT2ߔ#:ϟzؽKQ6KL]VœD [}2! -%SܲX[ "o LK>$x.IlS21=x-׮i} =)jt22fOgG'O l2_ZjuҋWd>Xq0̗uR(Fa9@4X29}*:k87H*@~e;1c/P 'eFdWL㦬הuB$H!"!f"FQs26iZL1޶MFؤM)%Z$dT"gDrrfya`eN`=8Ģ\6u"m`5* F%2(9rt&YP>slRM{&"䠼Pj(AliX&]G\{bOKG?|ΏijF'oDKDFӏigHd׼?2O}Ŀ-1ş7_\VGS&Kۦ'p&KkNK%gK&Nޡ!˸s< ㋽=nyl =dR/F7$˧DOoKnG ;MiYXFgѐykriX 8d.T/9`ޏf_pݿG~|~vUpd!u=xLJGg+峽("Qzw';wH@>0bx*,"&L{EO<ŲO'Ճ_y|r|v *ݣnrhԡ{% zyكϏY,>ʣ{|4]lRܰcEw\TLgq(8ן9?~ן.;i =3n` GOߏCϻm>4T[ --l05zo{_J\X G:$xKjϳ߅)?ׂ˶QZ}Mv{e%./14"Ə7U.?ބXe7n,_1@“VW  &8fAjaybJ& -y!7G#+y h]̚9DNBgGH‘%ӈ0QEةSWXxWM[pUaz&A'k.q){JrugݧD:)e;9rWL4uAZwx_ུ $ h@bpM2E(]֒SJUZNk KTdzpeŶ!BZ O[?}#T[H۝ev^vA i΄SJ`kDgf'52 l.JxJF>vx@N&[ab=.ڷ۞llnc)k깹6Η[^|zsz*aF杷;֥.8,$eJ' JB%  My(} {wo ]=(5{ͽ6%'oV[ǧgEzsY:$WШCRAG!xfŸ4F[ɧAEN&B2R\H]u/,ݎVv= ϧ0Nuc589]gcaGtBkA(V:a }ȨSpgZ{8_x~?0ﮁaSbL -z$%M ő< 3=͚SU|x%yù#hyRn s ,Ssb6Vy෼gy;<ΌRxZwC(ID=;dqӯ[),y |]ZݔRp؁4Ei=Rv6)aZs6H=tTT9@ׯjGb" ^gsgv9EM^ֿq5V]\sA9&psf"u4wX3Pdc=ݦ8ݦEW*p(7A3X  & EB;0u\<˥BQjPN[&q(joFI*^6^p,`ޜ(J! ^FT5 Tq),#fSx1JόbPԘ3:o"W?)Y!z{Xsu`e% HغDW(h%V6UYP.ixCEJڬ\p "ˉShLSՂsD}fm%KZwbR) F o޼ۗ'yY18?|Y9jpϖnT? 
SF_@^Rw =>GI/?f62F\"nDUcٽTcan[= TJFc Z`t&&YJ:pc'G٠{o]e߸E6(4&b G0SX2nR<\ d^q^R :q8(BUC=7&jci-ݧ;UOMcVz#bZ5ND`p VyGSל{kF ѽuHǦ"Hk#} M+֜iHW{Z'{+5ɺmmQR(W,xHXD20B:l$˷Qy-x{HPc.=kH!0 )_cxk wg#4h%\& 1OEx5PSS7ÿV_XHgHrg-Q5?9W,],>' smer+gI }Zx9>}s|{ca' `5s!`頸SO"%Jd*c*>sQ!@ކϱS=g1/hnαIީqx j['qN!Jk &cdH^r6$Qs-964a qTVoᖙiF2}][ܩ@7PH Y3@Jо?ǨDhis)7[P${/NVҩ@r/Tڊ$pD4DߚL9)=uF?&nPx̚v7i&r絽WʦLzu[=xR k"3 -3x{h#)0s|~v?!U{3 H*DFG.cBH`gAD AEj%]ĒE* y Jh/%%ↀY,-,xGI4 :\klAwb-j לCNcHޝm>>MݭdX6}t? /40ĨĖb kL0"Xm +Q^9e\J O;g<5=yl -5y.x5Hpb$#4ei$n0k!6A#&hpr~s.(MʲK>^yl\b@#JX*QbLeQi2}ѭx$_q[;nVo!4A0ɵ r B($<3@c,WL#,8ZI}_e׎GPp}z9o}3׈Wwaؖn1@@@UR uf=.6(/BPAȭ.WXҽus̮|{dtWwe=oK*ۘZwZPq-(D$H 3K2k}Q=ŲM̮ܺmVUU%T aD4@Vp:RiAr%S^[QNd6&vuou'c.n+zgN֡:ӥ$RtlӴ[0b$gXxJ`%aMQ_GqwwճnhpY+K-#,Sq̳ũOpfb+e=-bOC-V}^k^[ PǦ{Y2i2O仢qVhu,}|qUhօ +RLj:`TA"tȸ-FUr;w@4.Y`j2aBXE(@Zb @-8(8Iz ed^Jk JT.ZG[)X4@$wΥ1 UZflGs ( 6>PwF8@fXpk;D6y"c$*׾0//K/b9W\Dȵآ#Sr$9GeoS(W[ ;/`]|67mVg>^$ ] MP =r}@^@JQ%O|&y|t8rxޮ4 \'raվ}$Ndz˛pBG͞5]B*@η:,>+=.]`4$}e4[(1k\ "M`\$\.88#h:g mf/|%.Oґg B=cƀ:G&`0^lTinFm_@.z{gn]ޜg;mib4Gw1^};u.9z ;ou =P Bcf@"ze'#B(Ы^j"WΐBXIRLBZOA;+!ypD&KQDVDD='cy*" H #;h|V_e^ TMel2 f >¾pzBoJ"ҭekt95v<[.hfq,Z Z{@kTP&Zp.j@%JDBX,yp6I^*%4mpERtZ"{Ȍ %]La 0 YG!梳HT 5#g>E#6}5m\QQ,*ٸG~1)mJ`kldiT"ƤQI_*[RL19$CFZE.% , bI5ٮ߿%~Ց[1:qɾz7֋nЋ^2hbYbƜBZ \uJQs& zzh❭X%|H(ePnfWtȝ\63wuLg$!A ͳQň._oF Pՠd! %D]}TCf4%9hRhTD Rg.>8" E:jPŅeRBK'^aC@[+Y! cpEU[ggc޲zYjRNkیWZ)6=?}1\JBdȫ$!yie&2c= t%x'Bކ4Eհ)o:*,{l/F}:VLN` ZցduvvtJiY"qdmGAܗcPM+EؗeF']?dY 17ru(<$#Ylr@qkiL.3 4D ŠTRΝFD«Z2$f Ki,HNQC!$)f5ѕTݬv"a:c,;ot6؄Fq~=*߃$QO<~]OkFNjtW`tI.5?"CzWo0{=WKa=y}mqv\aQzKmۚUr9;ϴ{4}+?zmp91tpOY0|m$jӟ:Q߆jq'43 f&%ڙP?<ٺi`4ʯ6̂iz(^ՒO'7˅}s|6pkgyCuݳb5i)#ubT* ~%O|Aq56/-~o~x?Ϳˏo7ܛ?~VOW0!$|6iMOk^vDQ^~fq;ֺC[ -_ĚyO̟rBc.,16HwIxؒIm0X9Ȋڈ 8֖B`_-դ OzRt|<2q'Kr>#d&>Yfeb'% Bɸ$ai3cIbj!e컻y{Rw&uõIp:%OC渕ꔮCeѠEtJa!6 15ֶ7!0[hyQۋv]!(Lcv^xHK3J. 
^Gi(WM: scHeB/4Zv=ɽ4WGttU-BFw͋kA}C?kM6HxWze|[-6o[ڱ2w|nlDH `HKy vqRP]T:#XziUQgC*.I2gkJEK/cluT:lcmkI*_h,suLUW="DrRPʐZmF?% 8/tbN-ǯs-?5>]s\ }ԏw8V0\^9V )F)lvBLQ+luH4o] Wi5n߲e'qPkgLY^D J7"ZLͶmmO+nj ]Q0kϭ7b9Ӓp^^MkwĂ=6aRR$J$/>|X0큳`l Uؖ/DIAvD3ģvlKof^rrm=)IFc_'F(u9vlw֜ؒc̹ 3sZ4wJsu(~+т̍ƓUTtV@ȥ^\q\TCIA<9XWVzuSx۳ܒgϊgokBўi5*Կ؆+x^M㺊 G9 K׍emǗWS/nE[ȄtGI4PBփs^)*{)"z~$u9:ʬaQGu-Wnk6ֺLK_xq.E- _ :f/o.S՟G7%Q!{PPi5 X8 VVkTT+r#՘5o6ԛX(ZzɄY{_&-8/EU$Q#0굍nK_bGZ35#@Qϯ˟ /E58EyQ]Y{Dv[fSk?qXl]9r}ב B^fuu3NtleυYFSyNK UTK ȇ͙U4(ō_8{u+}H1 Jggk4a9M4Pa҃:rFtP<)-`ITнB&6L~a(% $#[CrgSƄ) z3$ M =u+7JJ41F^HƦ0o3 <}[_[2H݀TR7o[c5&"lUxݍĸR+K> ViJ>1SH N<-e$FDG&H*ur>) L+E "1|'picM{x8V>J1'NdJK*$k Z'H"$ cFP<$NDzch9B< 1Em7w4Ѡ8}2]hw֜ H4q^< D C::J/%7ʆP('P}KY1܇?^3J(L3STR9s)s[ISGs9cÿN_>n] ^9~P; F&,] y]nL0H sJFi^(uJ)Og]ppPTZgL(F+R GT>4|0RA8G i[ Bі\X02\0N'5:g2瑛s;YUHg(968l55sqJ4}jq?/\LeƝƱGG0{ttZ%HZޜYcR ̇RPn4 ^Qsl<Nm9!^9 kc e1)2A-rdyI@.-L%ʬb3 }Z1r9[_O d:鐀=rOM#wȬ7X/2iYbxHӮPG51H$.&MXGe%xG?HXŹt{RirVep 8<:d I֌ɠCJv՜%Fq&I|\0`n)0޳|ϒƕܳ<̉tD x:NTXHRҢ"\{<Q{R2 |q#" NEA'A:7J^nciq.]$띙[V%.BHZcLH{"hS:k5tyECTpms̮*~>i=ʸ>zHSH&#-.c$Pv8"_2f(ii)e"XĄg"&)}hgg'F\4/iACs>G$߼(Tr &ߚLm>uV>9gq2kmޤԛ>mlk~cч LGp[=ŀsֆ$HeV8T COoG#L^1ц&Iꔔq`^8SB,B$ &L֧t{ Mc>f ,}H>mf|Zw`oQΞ$fGJEYTJSǁSK%0o=BD$I2`\IzӍOggW=kUOy>7̓^^Z''*@DX)D'8k%(r6kN4O. ?]I*H ZRǏ9;nks~TLj#N~QzAD'D' w&Hr@B_ UhFf] nkԝ@Qמ6/uij7W74ho <$K$*aHÂF'9/D…8]¡[rvA)(|Rv2O,eYř.琛37CHA.H1DL)?nU4{r``sCpYSƓIΐ4k-Ίn]h@ 7:d(&;Rm`m!WM5W+WZ%f%uT~DLS*FWo@Oo{9A^ny ؍AUoXIښ߮+S>ߛԔ)/P=)wËD cE[^%%\;UyzNs{nR J?k^PS>TevԘQM5&Ȏ}CBAuw;.,-#Ftnү .Լrɛ 6|\^70<Et<.5E4bDd M1u }脄^'J/N fAw67v"zv ^W=+RMm=?H-S,a_ü;ɛ/EFя?.5Q^Qw\1xk BgޕX)ʏ:#wɑ#JD9N^2֧S͉=TNQi"!myߩ;tj+[t [6&L? ֋EX5{";Y6sK*̾K10!-և gs~1Dyf9e&;("y-&dN7qU -"`4é*tN\<lFiK0T&u8Q2y*N#+G z#wdiI?O0%i%L8A<4 'F;J [4z=٧o푋[dyǶ'Uj9rW4*m,"%&&#C$VF!`RR HI홢kYL B%r {F%lK>3U !"C4!I2`ǙZqBPܚh{D*9S{q>P.<0;eZXMꎱwpaVwt5lƎ<**4t8+EtR 8=לv; JvjQLlHD>ģ\K.ANpF*Xc5(%!L>! 
(W80-!eƴ 43Kv֜]֡A ҅ue`m~"qlݞyќ4&/[_ܾOd|[7W⢢Pz|H#r6T `&:N* $;^ΎFEej{ȑ_݇vb-~,vf_'$ίb˒_bɊBr6fWXIRrv!CXorL| 6ճ JY.dҡelB)8^]ݩP"xXϚ gK=!WП 5Q#V*R2c0!ʖihAsT{BIPrP*[+C~j2FڬPJFEMzDl48 0*̫ jxIOZ)aoZ@25F%SchY%hTJY= ߯8f2Va2mImE1>jHld+Ư5o^:f.rPvc]:;<h&!jo{t^šU:ƿo 5d0|&Y15b>~A)6>@TX- v~CPʿ3qL&gi(/"cc`Ŗ[)vy q|{ Q/%J͎cO@qg<A)T@hI(cJ%eZǩC(/z}{zΏ[eR{D222[*yLRj)=! [U[E }en߁6,n-w}"Q9wc_M`9?{lSNSV Z!xt1+qx9zgu2q zGCDAtA*#/IE1t)^gUk嗖]Q+1Hd{+iT]1MFs.b4k]׽ŻdͦA}Jܗ S_Q&lc ;wr_OׯIO+p_ߏ*^3zuM\ Ը5(üW= Sm16iL_ ]6*C ӡm4šÊd7+fn9s rQ  $s-rA' + TDU29\n![g..A7_/MBs/(Q@s [6^oOfݤн/%0@0|RR]]P5T}e@TQh+c¹D0Hƹ6]p59#h:g I_,KN%N>IG @znju M`A٨8p@m ^Oϗ9wf?r>KhmRa|G1^}" ؗ}_=q&'W.L@9*jU`:%(6z+Ы< *쥶."x *-T(!u$(OmB/]ԴbpI-9"B^""rHU %4Ckl;h ؊=?Y|@dOy2"f/2EeɥF+M̥h$ha`Y /ĦJ(M܃(>.e)S$ad%Ls(Mdgp6NMҴ˃:LNv:^ lyl}ޗM윟3%մy}oUwy_lArN8_=7%Qƴ͆Ŏg|bn&b`1'hB "TSvEE,&GosQFE4fr00Ǥ)qOBl!:6I!EƤQI_*[RL19$CFZE.%fҀVTϔi5-cW.κ+:$_g3).v vq=ZR,K,Zۘ^TS?Q+unZ# pi`aJ:x&Ćb]r6~ܪo?xt(86WͣQՈ/nڨ0wT)aȲCCbW4hR o!'$p|WD XYd6Hg\MzaqlQJhD[+L%2I>2:0\TTxf8-r1[/K i { g(ŕ5=s}~޾XZn.xRBdȫ$!yie&2cDfYz(bDwmigceXUIhXLX'1`awVJA<GF8|R}IeZ񠬷uOjVBL 0[%d$\@N(>XQ~riiEʙ 4D ŠTRNP#k "U]`xb T˥aiH8NQC!$)fѕTݬv"a:cw,[7lL?si࿃#$-(^^.&U=oi>:PSNДؒ\~bHMUf!_հ\ _φg,7<%YTz$;># T)pt.Z&=R0i_lOh컫k%$e`E6 ֨p%OIzӻ-'oT)ɼ_Jz5;; (&PxGYzK:`;!-sS .hHomJ\Zq\JoT/9h_n=ݣսƿn/_b~|Xu5dR'ospM~y^jr43 'PvoOt]7vsSX~eE49m`źMƗˉ]yrz2pk{Cuݳbi#c4զ/flʨz_#9:[8رQm܆F ?LGO7??7? ͫzןUO<sJ5!!z|eY߲kp | }b|-q=#d!>Yf2}œHY!krd\04יPĮumvl}si3xI?X|pYz( %ȧ& $6VyIBL!ѠM } ;O/[ -$ƠzB=v7Oo)o[/NUzf]дD7<&o:pNuDbЉߣI"Xs`5 pϬd0d5A bR % B*kœ4 ʌ d|ZKPJfXL}Ai N ;Rm=`F͆00k!%>f,So/Wx]iTだ<Ѡs0$c0(Co|)N0A Ej$8"0E2HeCBmjda|[h|.u@pJ|yMַY2@WMYw=ȍC}hw*tm0_*㭇|ww_wons2{=F9goe~4|}ÍW+1-߸s$j$$H4驪*r;Q|**Ul!nI˓@T,pDs|6_$; HNGB9n^{0rHR$HMޜ(J! E j nnP0g8!gQ05N!lJ1#&,=ZjSVO=˒s`Mq9LQHBl)Y>bE?O`8jNY1'LgYat<{%VO?es=ޏulD^wܜ _˚ܣa.&6ꏮj{fFpú9+mtuv0;Lq6Y,o$}NZB d. AKYs-;XG l.YY' ϊO~- Jk2jMKk8q1R3 X`$ƖFOC0#RWQ:,m>:U@c'GЮQίһ"E614&b f9ad 2rzp 4 ɼ+i!K8. 
BpUc#8͂<^-W6`kͤg0x`'yꎇגtDW f 0Xf-1癇34Q㮛N$!-6CYЌuϏuA!cw)^`)^ BiQ2Y\aqv!ZLG*6g5w, '&Q/snua.Jewj`DCHw Y)J\v>E}l"7[] u[ʝ$ѝաby.B _sS+y} .:X, ,9诓-c㘲,x) *EWdl{X@-HOQꢆ(vtgaV<ӛ?uHJN;ʫ;iÇFh,9?'Ô0wApo'Su'Ĉ`п.t*gQ;9ggEYA\f$C XV”I}kx\‹QWM1Jd1Lga(mzc@z|wY6ze;Yy>Ϧۊ?T BbIc>MAwttcA邧nӒ](o݋lhg8Rl϶nEj/4b?NA1e̊T)qBG6 툟_ HÓtĜ^mmXێ#4R t5keud?Hu}}L+ tƁF(0Րg~ÀƎzX qiPt)#%WtI8rʕ؟z1NI'A?nx*l2;uC q ]$~\̡?`N`\%5٣½f~ҧb0kRFY`z!CBמcÎ5 !y$cr{Ecc(ԋ4VؚFr REin|Uo;ǎ''ױUju9W];@Ut8 wblc G J㙖F׭e>!((Q]@'Kݰ虭΃h} U&!q,Ǔ}IIzڤeArhFFٺ=w;W:V??pBzw/(eagJp_ 4YѳՂ$BJZT$ rJBMfY4nF!t o><,=BBp)vOGii#FnqƮZ~k^72t\{!@JzE(qm QYkV\L@گXPvיJR$bh".NSʉB+!!sǑQrW,2RRIԛubMFTWRI/5trHWwRO+[f1(]m릯J'Yþ@ ܿ?"_N{J;.5DM]aGR18cȩ޲c2gf,w돍( )/((۔7X/ʞb\wrL^5~AoΊS*sL:5W2ÒxE8;`E1I24q%D C!Y8hY(rfiUX.Je)ir_{Pa $[{(ŭ9F15W1_$ jvb"io%ZwHoYcPZ=BҌڅ\O /:Ct0pa[FÆ.eD?c .%:V !mαϸB Q n'? vDn ~y}.'YMNSOlwXhS\TVNhz=SF/+eZnZ't#] N*.AŅ{y$H~/:WKƎU1awFHV"XźlzV؇Z1Q)$X(5!k#(iYo ΋E0fE'Y_GՄܺPtPdۯd?E˥qn`$%}u/Ie\;n !R`Lߴ”ŒDD=s`^>pƔHٜXa\H/px^8fi=*02@R B0A$^7M (Ulug g{BeܴbǾ[PI\JR Yg+#rǂ%sڜސ*/-sJ B(8TxR\p"~0_.e}1ԑ߬E ,9Ɵҩ;ϊP'Y.,A)V;M%m ˗Mz ixa++!|VEׯIh񼾏6$JՏWeNŦXuNd1O87mAu ݅8g6t~ ]D_!ɩ>2VvLz) *%?d-81fVꝌӛ?DTҠD7~ZoLeDb[ĨrRa1*dBJf[.3IO|&:Ad4)"V@6o(u8*C#1!eG,JPn)ՎK`ex;lY988z d=/HH?/&M`1yOd8ĈK/h B7 -h%SyK$}F" ie&@")Pzy 7fR'}Dξ=Q 9qU%r^hުV^$ iލz=/ɃHJhCuN4N3*O]Njn =kCK)ߝ p#SJ\AB.F͇u$zjM zGh>3s!! 
1grKNBLꀴԋR,^nIp %!!!%&$c0|0`Vn?(VQZX01'OXW1X# tlڼ˧!!`jȻR>8y%A$xGX>o筞N!EJ!( dQXDe-k7nJ),8#Y P\oavsyqFs'NA YuL#i c"&e1fA;r]2CxyYx j`w@cIANmI* QH|r\v0 8$ $Q6mLKLE23_ܬ\فҲiG@Ȧi|M7,B/oP)ΦՏ?&Gv?YoX8$Ζz+8U}tVz6tKmʃ~?*Gv1)g`9&;,X2"1| Sz!YOnewPA{8xNp8DeC/Ef$سy;^vG42?i8NG*&OaMҕұ'߾/`$iH>p?*B 쩽ۦk!.}ţO .9/쐑]ٺ9Qkv,mW;Ui=kPŗƸÉ O`(xP^P4_(u1iTTp0(zC:F_x9zFjH @D 4!*%z#0^q] ٳuCjI@jnP:4@dv 7:g `JMg?>/#%,_sQ95/sD /ڙ`d"d8*Xn) A@E'",ĸ)3Ø9_N\J3 GŬ3I96 QUJ}H)K7aDa13)z* EY3KM wo*޵b&ewO<.).1iIE @>KJ(aJbsma&!+0-y,;q,v<J;y,8]1h(.tz2O'p.si.i}|Cp/#B|Dv'w k,op(sx-X0YN"cМGQX2" Vh;U \t吼WW zs gM7JE׍= ׬7} 7Х8B _ǔ\I-]ߧ4CӜ!,P ӥ (36ajhTޱEue F5]NЫ{ֈ=KPP5L/Y54&iJX;P 묵a`#Rg׿޲к:vδ`:güP ڎ59A 9 @ Γ֌%TCcۍ}Ymp Mrq^6X;Hh\HKUq|[B*oqqTu(7V\ yHJ iYDmNJqwdIcesml^NޏJÃT! j5һ9&d[D%G3to^N)a]k3JSLIA(i5"ĵbTCc{Ó6"'@DI4&l0Ժ+ɮڠ4frx~&5Y=x$)3uXj̻U4l V;SCj}pAXgc^`ds$?27$΅|*r(Ye~fDu?[|T*t>޹^|}{iJLxb{{v( ] X5FH%oNd5洤宪'؏qĻb.Oso.z`'[{|bw~M$Y9/L/\%˼0O\Ӝ\%tt8rӍ %'L!K>4&~ɍ Ƅ_ Œ:x"7FwӨ $W+e(6ys<(O":R dZqd[B%/?8Z F~,0;o<$و$~[ dS^ZV(~^p(B䬵YS<@8,?@z SA'Pu_63|;pCd18#>[ 5ͥrgPd# i,Ua JP 36I m@:YBuwPgh&EsmP J#"7@<ȡov^L& $@JWj"O[hN+X5%~=i3vhOͧQ-\?ERnmcsZ7:ʯRfP]hN'Mn#É16p9+0biDG*㚫ZaEC)OG^,T~v;8Hy RuGvt0|6u_ ?R!{RY,5&É xD@K82xt0%ը SI$%f 'uz*W7xmxϷ (Qe宇kW=4/Yn`=[Y߾\H ;z>EP1r܊^qOvf[}$طn"#$?D $ 3ZlCd {Va-DQZ짿G;L7r_U}//'+NiVɻ?y>UHl^gqG`sW1$}G~c<[.O) ]n/~L#)s6iyg S4s1 =A`L)Ekn\?4"$CO]uv{fZ>N5h Ïуz"A_rFa_;?K3ISp꒣+`L5m"Ckk6]::mQHD#lxvxpݧ/JҷUEǯIc;EV `x)gV5|T@)&  >YKTCc3wC^17xkFuuUK4;h.X8(K:͉,<E>8.*+ GsAö E}ۂsHG.o3XlC]mpIoI{)xX1WV%)s!䃷iqT+q}wkpk" N_\H jGHZ|V6s}6üz&D 8FvvW[%hEn,{4f Z_^[{9jIξ_3+r7yW&A{LWVo.G]\(.'w@9|\~Z,@@{H/E8hi]T#iի;)%;Ñ@LO55}BAzɝCWi$RZMt0=|%~E>U}JVWF<7}(Fh%XuQ\go챼xO v`k| &iao:L< g{#14Q]=Z2 u_)wßst(7Yd8ːtJ94Xda.*`"GbSTC1<~x3,*&n0_{О'Yo^9+Dn6Ji ԐG_V|sg }~? zU1` twiޠe]+ ]5E7u=q5 %6E54E!x0WE֖}S\kQ{H2Dnl{S"7o۴7ό=m?3Twe,aUGYSkسMr=,ʿrdFN rxs5|oad "6 pcdY#? 
<jkS4gޕ57r#ˎcZEa;1^YoP$ dI}%(" HlU@"/_zc(VR.{4x Zْ䍭l-L93Q$-s:;}89~C -ϔVmea`?'_h 5b,(lOܼ(x˫˵'a` B` d0NycXb&ٷV8,ٚXS.4;L~ud5w;wڣWwW__> IcHv/T~Efp )?;ƳKY{ӽA5]ks1)5]TLWKNH -2_ަ!E5?GnWMnYyΰŹ (`'k G3&J눮Nv-Yqzl4-ShpOyY?.K 2~TN]ͼ4MUђw( „.B 33_sψRkӛӳ#3@2h\FޔF_NZ`%5;,`*3di $IV)5= kܞ(&򉩟(g */nl2;Fzn;J 3n7T~[`GD$:+Of\/ *!Yu1t<_iM~FxKyd>w~MɢUE-Κ4>Jg_$[kRMxS'_7!RRPrh\HP\U*WyO.cH! $ ޅɔa/YQkunD[Aɼswz>!H0*nO^,W,nFtbCn!$ FVbnTyS]`@pc} < r]D WʠI.Oz_U$~{z'su\nZ).ZTj$y: 4:W̪eI%iMe{U&Vqx-Xx6oR&(4ބjm+J$[{Cg)-T.f2 .f򭷧yVO2M!y'K[zL)6Un){/E5 >EF&6W4݉ 16~X崍L;b7+:.:J(JC;(^쿤׹p7֏zﶚWl\f,_B ~UVfґ=sAsTAg#Dx=} ኢesY]ZuI1@cfLFjfwXJE`2a_]|=_cx!Aughr~JK!|#$O5a퟿1{G'.I-v+>~70E^NXM68n[TP_1w`. ,(˰[bi<xҢN+ѿ}<ǔ`V7Ӓ@X?Lk֓u^Xr&fלT׼/|b]=ǎ brBnd `4nJjM]ݔiS؇YJBV6tpREa/Nk9K@}}k*w%V[[/ flSU~l)H 4IQeҰŨ.O'L˜v`Г*[_kBVYnU"It-7)ȊN{jboy:st!k'ah$zyq?T6ia$IIa$[˜:RE { ؔ/ ҹ&q陬|]# G'FyQ? ;x1s' ص1j2<尊:vb@E%1'_3${gw}EOGr݇d<9Է>wnΙ;29)ok =T)Od1Uo*{d3ghf Fk_`8T~p4 X>,h-M̎t&7^Ozdk}}A&ܖ~蚹YL)b9<$`=puE*F$z?3pndЏX-6xctAaq˶+`OrS 9Ys?JGL9{>a;yH/”fL1MD-c$ ~wRRȿ,3ﺫ*}%5J-HL1^εRB6oG[1/7 ̹&=VpN|5/3ŊK&}-"YIvv)6]M]hGNbvdR(3䔟Q^@ʿh i49=;m1Y'%@c'ҩTR](涅PDhOU`o9+!"En=gNky KT}2ڜhOhoe5pۭK! tuq~pxpt@PQW5kxAuP Ke ߇~/Me/6 "At 8/9D cKKRۗ*w:ĵ)K^"UFTj䣧%\ 3 ǔXs$ZT~ܶ [08 bZb$Cna{T!RQk?O,t+\OALwQb ;vߎa5ѮG&a8{ɻ&ۨVezv?2^  '?#s|0D4|Zt2jAmU(?y[mܦ+AOaal)*$sJ2i &WHv+OaӀJzP-j= ꀘYq0# +PF9p'!N)b!\&=lqK+Y@4[`^ %Q'w IJŰ1p'MХ,a|=G@1'\.ZӓS[MJ Uo*i`>GCPK2-)0sV)J$æ:؛4Pm lxKSFIj.m5.(1F^(l [@9]o9f.&|=94H\(EMhhL ,Uw9'XOu^B/k-?뤴fH.U ET Ų1ٙ m쵲§LLj d6x.D7e15i2-a#qki%5r,Fwh 7ZJ]qTӲ\ߔT/m޶E1R5;ǪE;al2R3Dk1W\5}Kʀ3;dʶOAogx ,rorBnd&psϹ"23kAsh4Vy>7Xxk|޺[lSeI mҼ;hR >CfI~qU I! 
,}̝#]Ux'2s9j\cmdV7r WD(!DvHqIKh-=.,53'NěxxLŽsuq}2Pd#uDf oN>̿03qr(\N2 ` 4ӢL}e4^ d)KzYhh\L-vx7`Ϝ+BVM ڂ(&$ψ9|Vٻ8rW;)(ŖvkN2m.;家LjArUH>Ѫ̵qvk,(=:"k8<[1V' 1b d9&ȓBsZ wv}Sd AA-UՒE]wK12ohHQ{ |HySvƾgt?5w2UBuzh} ~J҃#_qSP̊:tgc#FR0A&|| @g)-W$:q},1+?8wzkHz .u6xLc`0qڡmdw^o#.6KMFABݻOﶥG}; .U LԇS2?vyix^d e@sz(f` f_6Wl:H>:y}6cߒi񙪳l~Xx,y=,e'7|ձLl#omzͫ鿾!Py$<ڼ,=O&Ǚ.JtFJ3Si_"=NRw^jt;l@X6^*`ɷ F 8;l-|cMMyO1) %Q=渞 3rJ.Sy9;]6M=(A&]?9KP{Kp9o̰QӞ/ϕGljxJnAz~bf@$BQF̮g;X~H.%zKT!:gYٯ_ϩF/Ce:_8|ꦟؿ@;tXZ%oVx<.ӈx\bq$f26/c9 d58|SV2ա (:yߋ17E(FZ7W?:SNz^lϞa~wN^슟Ϯ86K82`kYS3XL0_?>t ᡘ[itg;?_8~>wldӂWtˆ G<(y=ybX͞ȘeaA;c@BF ^6onی!`ؾq&p`|e^"FsGlal `,ѓbvl%\0k~]o]bBV/wNy{.zy~ɟwZ *F 2Tmq 6rWXlvv?[2hI1jX\Y!P]TͪA`C/-L.Og# UsOF!!6yky2bU;OfnsxXWcy~ɟXAa#!v =0Xb+]j҈/ܩ(3޾{Uo+K2p^]+Gt nW'+d^R:"!bI^^A` ]E=-,%:#uP`s]@7Gye5ʫXf\~Ex)>RaN&FM; H%\lMN3iJ(a:͋g>|gG=)97 u}塟!o7Q'OF O}:e!IX27>b*)gjF3(cEfsuNӋaT36LJ@/t-m8~f6G=.qN9֞lzsI):~j}u?<=PBdzg0BkEoB)ԃ8\8c!o~'QɮpnMF*Bn`!e]8{[tv~]E'f\/? ĬN-Y"0,O{':q,{[crDW-5pjuޅ8&̑ANr!K?$fzgkk!v~ɟ&D׊ 6Mpގ=FpF&̑, sD%v96"0Zs^9d–Lؘ}#ڶo= 22=G#6uK ’A9Dv5TNLV&+/:}ٴJֶD-p L UcG@3A_s*ؚ *#N!$UZggqE%b2eز-B.,碃3C tt X`NB@00ٟ7~9I`TXo$|{?,gSіHe6*y-N/2>Bl̲\s/lK`ܵV4Y{UQ+7~C9pF#R'Z x~ ;"<I. 
x xoРMxؼ΀kLkkD˻G;;SRʶ;j0;RuRްX9ɯgcS"˦&@6% slZXTYPbF ^; {g oE#HKP?5{ JS_PXbz`ZɸI2eaթcm4y2r#bac\PsL;gȔA~&~_Yo;:4<[Sk5`a3%,dpԭUbD 1eR )c?S 3ĞӔINDcps${ӓ6_qI'T}ܖ'z1B(FqI*CF&1XMP| #[H).ub5F-i!@-j(緵# B86k-,Ul1Y˝wld__|::|˱D$=f7^-n|8g Eȋ'iϖ=[R-w>"XtstȆ[//m_;/zgfz ?}`cاb퓾σʋx$}}G5ÂkE/.ƶqPzjf̅Dp{jmv۽?G _}Rckٱw :~N9!1s+N"ʔvD$sƝ:Ka R)Hu ՙԐ)hZ1oBMO+#|!:kz1Yz|2BΦͨ z,S%q*I0m+ 5UL7r25TI6VF|#\bIbZf)h@ LPRPS%},1ijp/7c٤ZًM ,%N&XN.gZ pP"(R .AOVbs" n;v>@w2_V6^aa\ +-F6u(&]>uul6g)č3u$BEf9&jeYnghG|uU2-=WPl>YX%eQ+D١ 84Y]-bbODqEBN`7Af>|wwpiK}AG 9Yy=gg9YUifJ%KZ${Yd ,8k`{Fų-e-jLqąb{8Y# {y#@ў}:~IlV+b%p"H&tBi0 )Y"egh8Ԥ!eq +(,Ygٺit"$Z}Qgm%EN,4JЋ7%7@'&]m"Kԏ &Ňox--pC?+)܌,ĩZ2mU5ϝYŁvo F7VRw塞?i f<[1 J%n Ro5;J%B[?>>yꝩ_L5%S]yƽM|6ֿڨ|(g?yt7SY7Y ,c۝Ǿo{q"6*廫J Y2^  /\lT_4 bG.Z1kTGV&E<;@V98%`Eu:w}@~|ˏ{FMGy6*.cW:_2W_<ӝϮ~ n}ϋEΜC% k-5QwZ4~bCM8xb7* F+wW]oaP̋xR~o%vY>Ma}PrPS&cH%m(LB,^NS`y1v^~{yZq%O$yY18w^!<[{Ou!T3SH/)O!Yh-)vQr9Db{(94ӝ*L}rN//y:tfg`wz%=M4mʽnӪ:ww]T MS흁r ݡt%=t`se:qPC&eSzWl%O6d}RϪuJ{N{GSyW(Ӝ-7rH8Ul5a݆D L4Af)>Ͼ)mzE_k,9r/W'+<@/}eC8ٵ kΓRdd8tH!t۟F6NaSNêh7BbmaT'ѹhYDNaφbnk(Ǔ,(Y$X{ NnLIN=rlhgarXGge~_s|xSqzu 44O]nF4҈CcG.7kпcQ`:/cZ~ai*ZuVoE+<'Z9ò~V[22ֲSM0\@\wJ Ua@5 UVSb{Jb`mFF>Ĵ;m{8_ّe Fb+G+/UA?sl3G;s9emwgk ZE7{];䩁6ίM P B/v H&`0POp)7Yu?!ڤj};+ub)N Ar#e{"@Iݧ3#aXǨ!T]WI}fϙ U5~yߚ,eyl9$vޘx&U[@F=,4[9m.0W<7{:POf*y(M#א趮Tp+(5U!6!6Tsc7b{%@D>*lzH2ְ\ʖW}2NkZ?NL?@/gXSxmL{om$:f}9OZ"[ˣuꡌ|3\k{IΊkmX o畲Me+2j飧A]N͎ ;`4".pJ@κMWSP/i#w2|^1% o`C'YlZ^9Vb\+( ]pIOۚ\^;7乩ҿzE k[}ɳl\];ytEk&&`cN=t]V.sU;w-ˊu`r0Ş7S_PI1kUvm6*֚@=b8Ɯ@v0r'0JdjyˣeJźՉ*ᣊ]c!9ⳎXNTHrė%~Mɛr2A;J]"f]/qBC($+#@RR@ *0Q[mu( ]6/V$mѼ*˱fNe4scMXE@W4,SQ_ ~1s_b,sJQ8m7e SH̜Bb=d\9cD a zTHJ)4"H)ExppID8PHCZ2` #3^(ʱZ]'RXmz?'3]%O'. 
_dH7&ċWsYn6>@gP ߩbic`|28U+إ ^cJe7SCZ2ޕQW F_Wo!gPF2kV_tJI#.'_ `%g2sk\AGm8r1Cbx0|Ϩr+G暛r $&(kЦ0a9TpiT_BT̳_.3UUv`]nuN%}}wfK`1$wSD=)p-b N/ K܀j =VzלRLxeG 9`ҠUŞ @[+ I*|ogہ?ā-YI>}cطj kP[E"d(و>#dizb׏-Es;KS[&m \ y lmamj~?gx_6ޤ RǕenSM5XIm ȁrV¡Z7uĕ2}2ӌ<0"yBn`W}L@wY!SK`kRwԳx .y29Ӟ"&gޡZ{rݖ]|m]7uڭMy)iIM9?f 2JmΊ`=5SFGݧ*{jóF- j\btj9dϡ kž0e.x/ޚoތr6G]i;SBX !- uвApG82b *ϐK|.[ FץDlQtNNȟ^f ˬl|ϥfq%g;J Xtr6ωON)Xb-si R҂|hC_%~fޫN67ϥbKcgl|y=] =A橏i8)>A;k]e$Bϟwwg>?flRb^Լ"QwxHpup"EmMhsfLyYhÚ]-6S/{nY D*l^9LRr¿͌ۚЂ@1(H+Uշ!Zq-WOt4=Vara5XPtEWf>|";hڌ$XٲzOLLfO$9<&v%}`~]7% 'fuLMk hѭz'a+rj*;8M{jrs B |;3:0A V)V)[Hi]m@*+w׳C% i*y I(nkkV,ʲT7hn^3f 7ò|?9G^'h#活sDl-nnuwʒ*d=vA, bd}Әz .@J)g4K%H1%a٣z`6ڶާY apyJZ6\ !/|D>ǎF?;箢j=m#<LyE;KѨEae6ؚY/*`^;@<g+?͐Vv̑&6|x+ȑ9M@raFi΄}Ǭճf(qr_O8<}7r=?-shoo. ws4xo#[s+fz c}EzLn[AnGigCpdn t>ijȋ gY[m <t-ȑ9o7'dK_2[&/c&jLo$qz| %iij^ 99:sf.T, I,rο7!3zv<˯LXl8@h@{LTP=(B_>@*@*Rou! q&͠uXHG[/D}[^Y7Q7M77.xt,ЂRf,跙F@rԸ Q&Â9e Yyc٨Ǝ< [vڰ;WbnR *Rb~5r4}URMZO?s5b%NV]s~N={쬨~t{a<]ߧ8ed“x<}S̞o?;^U՛|Uhcz9z'ۯo[Ň6^caz y8Ya5Cyg vzGOqC = HGhV$hxG =o^P&GvU贃=sR'_yc[jVˆ)󴐺GDjyyc@zT@'A^JE*4sư=kH_ ɾY21&.v5A;7Nk?jl@m7}1%@;drk̑H9vYHYlh7n`iܡ]_թؼݘͳK*3}H#t)V6וdOOO˿O/{ }(ViM%ة*MS\dU_NjblBX / kݥj ZkM'D[r frs*ș uuFx; O' ZZ6c` o6AtY֥RtKzj(s$"V(BMD%V h |JoE[!5~_O_N5Ȗo("LTJd)X"bߒ] W &W2ձ 4 Ŭ}T"(W)8+qbR7N#RI\sΜ`ltԜ) V0y z9~3zR@}Q] "SsRI@JLe7 iXȪwPX ȶ[Paxtr2HzJloUuk_Rk-UV46F*& & :E͊ԜȊ9R|#&3A "5bk}ąV' s$ku8ɖw2(E R̵99Qt1j6et'?5(x}1:WIY'{13RsI7>6HXyb3@SlډF;1&;g*;4$ " ևTb˚Ybsj qÆA#t. 
4w3llf2B\iAn8_Kwl7~vaVlgmrz>%[ y1J&T0$Hq=b]ɮ9dWjz 'ڌ7y!Cꗋ/XSN+┐pR zVЁPkF PuGTtL.4heͺ^, {50B ^CpcLK`w޻7S O&u%m 0#Dd,  W1Xj|(NЛ!/ƇqzQ'o*hWz+ȑ9{nP21گ/R'`k:F6KI,&G%AI2C5!fm' [۪ň"v򢞣Ӧa3֣.RI'S=bt- 9 "Qp{XީD>Zq0Im?=kVǯ>~,1ʴN.52{ ^W֫j{q_ IAr!Gߝk 2g9łvWELBJ}bz7Y\:93 7X׊=E9"^Y[sCj:50Ie @Vϒ 3Ae=L oh ~tKw%^4Чܙq,%cH0:;Wv-$_TWRP,$"ZAO:lDƑUGKJHZI%1bXUS:' "g=<ęzOӿ֓0 [U -H3'`݌A&YS "$<ǪgIW~BW8ջf@ǎҺo;${p0ADDDy5围=)ޑVie%>狳ϱ Fh@vṛEԜ*H TU86k-#FRwwٞ*k]?l{Cк/c$r4wn @uR0M+vVNf8p3c9: ]Kw~?*ݑ߇%1A=lw%r d)ςxV6jhz[4]@ǯ21Gu|ƆUa  aA-(`zȓ0"ӞḰx [.eP\=s`'n?5wBQ?܉`1ÑSfdz;q#đ9 ,5uW۲r3;|OV͐΃w<[A v5Z#e_=#+Zd>Vz)ug [M3 OzВں7 f~==?5^]p]͐90{Zon9gP2oqzYB0WⳄ*#_${,;}=2ǝ<+~ݙ^y1TlF 7DmHeiVdu3h׾WK‘[;ߛȜѓ34wo< rBɬHsl^rB3fɤ¹_?G-Yg| .1ݫUorO\X÷,}ϩ;E\LB⤞鲓 O:,0{,{ӯW^?Jx# \==^-߶B)m0Zcˎ Dqh!bLe#}5C1ڳ䉃?&Tv<;`b" _Vho 2oŸ9' hUܡN7ǎYmjHWJ-};b(^QupͰȔ٠IW#k_t]?Жn'h16AogB3fm:SA[Rg]!%XT@z<f=S@K'.ab  8֩8Fz=%nĭ`7AnAֿ0 HJP]uiE݃u+(#sȭ GDOE{7(<SՐDdNFKyZzw%ޕw%*fpf.eOe8O'U?scnՒ7Dab݇[AI8̰O{EGszFOܣlC^ Q 4v#9řQ6AKe90%~HHCmVh8`l_#+8o^s_wN(kM2Ü'R;vFJy36~N0/F 0PR;vwFݾP;qGšR;J<1bh搂E҅)ŷH {.rGTD>azh}/E^+[ }0Ku:܌?fU^ Df9WϙOҴUιŜ1ܗ2u&YtV2rkSj۽&[|:lʦjWOubLBB䣹|:w^V]/roڂbMƿϔ#e.`Q̗PTY+]hiH9S>K-2 h<)ksWm,Zoa{75sh՗~uemp3+F&@2qs/(w2+D*#qy6ui”!4,4aWI+.2whVT[h9zOQECʲqun!ѕ~+1v}l#1WgwyC3F+Sq <ok7*‚N`٤1=+S^1/=Zʃ,ՕLKiQAufEtQ)3pEväJ=3ZkiP^$ߩ2Ƃ4?b[s&p贌62қ^;R~j^۟u5TQp4mu5{sfp]c]Ք gAxW |gf.yim8]ۓu8r]ژk:IS`A`pDn+~>E"{kg.,%=;vg˳ Zx+EB J|tiFف1Fs=ƞQGK4 N-f aoˎ-P\kt+̿R=Oyo˰XeZ.[Sok[? ۭS2lV2lTr~N+@g;0j};X`*;(Xɇ6WY{ڢԚ Ԛ|35dׂ֪[5!뱧b'2w<'n9 K'fi%5 {\yͩk~S~Q|?ף:uQs}y>ѝX 'щSУICN^m7Jm͎8&٩ډuGr)pFTgmzgU;GvdiUE $!%ia0R/QBz|t89tP~'/]l;kP̅i}ꀢј6DR[e u C28eR{]Pz,UKāmY8RP,u Θr-d= }? ;|# $J=x;Ê5bǐ[" !!jM}QёQ0=XŃq= W ߼ԡ+q#Q)9ףРe*UKBv/K u`Ι(G| jfevfZvp%cÚ=kYmNٗJLtz}V5ׁk.x6 b ONOFB5cdhiƪ暁9<s@^E^|#zNЩcy昨]Z )[|1rg!QS-:)f)F)VMeՑ-X;6ֳ.VbAՑ~|#k]j:iq:0!Fc5ʣ%t ){Yj'nXE(<c!Ujn! 
+;pj'n DG@0}JS vGG턫 E-GSjoMU}9kQ$ZjI4%[ SzcvV YoB/`eVjn1Y@vBhr 詇9K1FJ-7ꜹUjw΢7ڑ8 FPjn 9CS{;=]?\]wq;o@o-חgHIwҸ,fzK/}'ãFWFM~Q62ɜ2ꉧZU#1j֘b n.ĒDS?&ppyu2M?'kl:'ů[ӯ4&Ϫ˔l>f=sB> ZMi; A莃_of( U4w@r34*^!f* Y١K91I3tvmEO>Ril!R0J^͹c*KL͟?<ȁ'=Z{ U\L%0y Y y,r0sEMli4Njts'RW:l4q>|sk~ hppO}HQ] cO 6SvϺ/9o>;KI|ZXGk˪o>]MSiE-|2SZ&ˡbkgD9vxIky=.e(]\sBi:oʦ(! x,94uQWk=A2 CWL)rLISM=H2*gUK/Fxma.,"9M,_?ǫng-4az̬UĢT}gEu0E+G,`ULXxﱱMm5AYq6gB\"V[dd5,e~e~VklG5'!qy=iV^/3sZ\ hrmt'% XQeXry餗q jow7#/CPtjTO+mٺq&;6~S[aKY N.8?ہ/kO\'nNj kmpq;oMEnƂSŽ}1)ny^Z64O,x:P}|{8Yч]Lw?;8+Ϯ7ik1F;{^qz/>mxG. [A3; I3S9Ulkr!8ţ):Aϗ°oˏm`]WuD=* /]3T l_lא'jzq>0we2tUV 1Q 񭸋qD%&;bcso-ZJ4xLqOɖ#Y4Y|Q Rv^rqX8_tuߐߑoT~xqA@x9${;" Եibo {\,-B?y9l[%,c M~@B0v?};oZ#3T;U{ ?`V9Cj!h@u}aZES#B7h !۶$G~aL.oX{7bvq1OF>/&/hMqNDŀGu7JA9V RC}C6mgY'ǰk'm#gW? _cG]:)ڣ놐zqgwӗl: LLY*㧳Wyvee{R^}$L6IW}~~?nTN>J8xKcݺ[(8w2Ҋ?~=~ztS=lPK~٢]ۣtMǸh|^X~T?5D'ߣB%KN2& Qdf v:1llgٓK#xCawa_Έ3 ˟>JN%M>޶6i=/FNf|1m#"~۱~1Ev Ht g6Jr3$mQeS")E+K̜s/EQ#n@:#JS{$uJ0S|bH5PEIR%`KGMXSk7nv[z VE=)eLfjztdj?xΆ'`0h_l3juk=L%t\kv A%XϧFa>5 QOk`0t’+ IAr42ŒJ"8,#R3VV%Xj md'ѕmvgRX0gkL>/1=,3gi&N 6 +\ cR4`Ō9z7LOUwu/xyŎ} @"j ~%jЧ8_k-N*6aB~ Of¥ $H9#C|',SyɬKRLIv+$In:-N:huš)k v4f`/r$&#X &IrT2c#W:d;S<S +z1z5OYuQt6*U0By7!X',N ``e0YvpH~o{^En;]#jj3üO eJbLe.:`=:GS"!ho ƙBP[v6O_'E7H8hAȠ8!r} 6+3jo3c_y4\o1c#(B1< ;18нc`k:Vjl-p \KE[TJCX \c}LJ/1CVcQc* ϟv,>10l^Y0/0+]lW:%WcՒWMcHz&z9Nnl #̾%S ۱+F}3oÓe_g|Om1u't97V)t>cr Xsw|Qe%g]-ꭃ ,2Q9-ȯ 9pݞDH!u]DccILewF( dQ9Xl!',ϐc WF"a+mM@MzϜхf@AN/o+Jޕ]N:,3^/A~Sг!}qǮ9MAW M!W~؝CVtPvPtxS5aUNߧeމ 9J5X&0"7Ȩä\~ (pSi;Q5;?_FĔBE  ԰2a: 2BZ&ɓWc d!(&!B3kidġa1pV,w[gM3&+~E;v]cչcլ8)SdxmB$|Iqeb5J5#Lx4+Zeŵ}Ԭ!e3Dq^wbzQ֤b yǽm {3ե#Pi' 1lʍoHggOuLF=bܢ1G3n:,pAKak_ۄ|&lV_.^׈|e (X L;=dя8а&UKHG?~=LE Wcۛ2]^8p缆=?%(=._}?.=lyN~ n_zFE=$=\bU>QZG8ͫr4Xϭbw'}g9$?Z=('Qv8լpP{\axTM 7/(w8+os~G>zxߕ6x|\/]>^8v5Vh - 萜G[xQWӁ `th 7N1+03y0m8kxhAw7H&@s ^rRM=~nbfY՗SÚZ ]5le2j{aJxՂ$ Ta'jALPP-Wa\<1}ypQ^&_~O/ohK`Ys  , bX d}ּM 6nbjwݴoY#T.ϕ9 -zL7CꠉSAiH4MҎ$='(ц!DIr5yE-v5 2twھy#NOr-ڸ6Ca*)O?_eF'q;8DZ׋4M^kTUk Lb23n)@*j-0%Ε48V*:(<*]Eκno.Bʕr1$!]H 
qie=u\Ӯ|Y-k=o|}b\_ÇgB3w>+Ϗ d:+onc`&C'{t]nzu o2LM%%cpϰ3 ~vZ*s~W9%@O&49HX~բGGRxVײ܀dے=[򦼢dGj R$冏A? [C2$<ҏ]s/)FLn=ڤ`!o/-@մJ|%&܇4~,Fh(Ոc ØB IDFtm,lZِz2oվgz{P.+NnAKK'4BZ$ӠWڜ̹m(ApOśыӗC 8h.0伩m13GJ?'ѥTzJ*"t(/UhU\Y3'Y澬j[~Q^p~8V \Қ<3"c[@Ѽ.R1cdel[~<^ v,05- T -b˯Y_ϳ{^cc2ڧU/j]kB(vmt/EQܵ\/\T9(Uטѥ]14Tе]6y2g9XڬTrh-`LŏYkU+hN2 }AD^X>Rase4q'K@"W\dI,yHS왤 sBqBR9O=s(KWA 7

t#yׁ,f,_sIv|Vt r ]E$7J ͪH cXP)c4耢A9̨  JQ3ae J@#EE6Ɔcd4 t7`XC e)13ZU-QPV e=¹fa[PiwmW7\ȝƣ/v+ix *9EA(-Ry :qLzՀm3qMbE:*nZ(ΤaP8,u]Ѥ-=a5Z$6_߀eb6(Ko5k+ȋ% I VFKshS3|HN ; Q)Ì(щʨ<wZ/Dr: *ے*IpH~Z~oƷGWokv?vP<5•-Z1mWm mŕ1JY Z_s4/1t>@[A.7{˚`UMًVFdsaAzPwH}. A)Vq^HThwo5ovP5$=+pt{ԑ0WADlld +({ȍA_w|_! m Mr[ݒv-҈3~t*UX e.'EqF/|J*ӋI<' F";' ϫ!ScB ;f̛Ϛ-rd->Z>ka̘h:Au+7W8M?^ 뢢(WjS\sԬ9@u5 R&>Q祬*4hR>߃ڒ1RkX /üuw2 HSc>[^OJRk-{ `O7qV }'.5w`9c<`'k`W/K{}[[ˡf=jT>ϲGg^%}q݇MgQvi@!/lg;K;fS5xUWb/o9 WT&:EU8zRDʸ`ߚ _c<'طl)I' شIZ0IYs<(4VXĿ]][S)H꣥⌥-Sּ ƈ-UQdrxJm9#ߑ3r,؝J+s@תR̳ 9#wDǡbJSfW E ޯqA.!Ȟ!U4- +.!%E dj2B/=*Wۣke/I&9 ` BaHl>Zj9|iQ9kȴU^zpvh@)[t m?i&RGQl7\Zgd"wo)e~6w ݞ5uMvݜѴYNɯ\HR'ڭKwf1R9{+b$*'< %Z|IB#W(Ȋ+G>bFycҡ5%SFA++2nGoOݲMWs6ũ T#NV46>q@R9X[HFA)g +T `.r⎧ kVA 6Tp 9혉~0ci)系+\'jŻ B)A7Eq';nJ /z?̋3}/t,՗hzɁˡ)ai$`JC@T9bDsfO 9ߺ_}q'|5cKvy=,Q,Q3i+t!0Мs8g@І+:\}ޛ `B'\g;z}_|a69cF|@HN%kV{D6Zعʔ˜0fw~宾5fAC iÆ Z:mJ9+.BzLcYv!3) Ņ\\mMZZ3ZH 2(7XiۨhVH wMģ}? IuH,ҶLwn1VrKL%"(E|P(fgLx+VXekpK_=&Ɛ/ⱨnNrus\ݜ4w96: (@0nbE!ڀVFR JC# tP%e;к'*qISQ=B9\)r%-(!xDE`TFX)%B AQQQ2A_mE u.SqQj$W$%iVBE< bege qPd+) STob{RQWAžU6{QQQ÷G1'ڵ~\&&B93VѶ]誣ciYIQp6gpF¬N;&Fɠ]g/I>1K41hw7hCp`77=bΫGW:NG_o Kmz~kމ#cmNճ{aioI^i|qgFrpWjp3?UF'W37|<=9A?ܹymc)<窱j8+vcg.L!YQȗѬH IJC1rmYX-Q"5Q}~Q`oƕ,dqrry|NΒZ.Р=$@h{JoP`]=@DYU3%IVJinQ",P. 
j\tOY-m/9r GGֲ6n򝆀Vi֨*BR:GY6B{2<&ܪҐEHQ}IzM< Tf̈T d3 2 @WrLn]@]Z.}ޭMqc)hvѬYPgup`u, T4 טNF?鼏1|B XݟO5l~3K}'/Rw2<9LfIď%m@.G(lf'6_ ndOz%2g22;&\b^e$4\qsm}t5|+̒AaG.+&CTx`I(I;t{31Mu\Ȣ(fU/>]ݎg`?Ēn kW-Wù|nQbR .w8䉱iQIHßVE{ ~WY"ES[`O6g$a:8 =Cr՞VT}_`iP.H)"j_.ƵucX Muw׾ݵow]u0zݲwəcXRX[ BDNk:>_'Ci:ߐȆ,,Ij\ [T=Sj!Tؒ\:J)Ĕ$*%9WL}V;П_4u ʿ(7'\ww2W9*2G0_`B2 _˴Qiro݌KB+0`bؕ:cTC}2+."HRk)G_ܣ \b62<MKH0ecHT#`T\9Q*mῑ'g_+EϺ'7ߖSm ]qxSn뇙6 ̫Cs=Z~c{_}r񮯥#aD, //\ :-3vygXVot 3 {%RDh߁r+B`Xċ8_( #j"-5pJw ?nO֮yORdu0;/je;lwӟjXď124SĐ Vcn97$+v~ONa;LDQƘos_b8i-QK$wSziC,E$hЛC s͔_[/Y^ j8BX0E9)$jyR6-aݩ줄 QpD"qG +pۡ4>anKwwLwE]% n3|:Ý?s-oFqCs]Ww[Ƕ--eBd;*Ƅ V jiH4@tm}$X  IؼgitK= $}4Ԃ"(gWe_fw5`\q12*ۙN#ݿ6R &[a0,(ޘ}d@*aOk"5<ϕ1BSD^#[>XnG%Մ3E|\(3)ZrJűm7ZGl77+ooW޴\IG,8 blI3zQ|,ObG4% Uc#**KrxpL/+vת>RjBaX%L;KTN8Eh۷k"SCI.8.,1k {uR؟d%3 X1Xӷ0J.2`x(G(\,ZΨsjr;R U LAQsm1o\,ЇaPy)񧖚YanO Ltpk>K|dx *1jU\3Yz+Qz0qk"g*қ"?KXuvHmxdg9 ^8 ^ܝq J3h> {38SR,#Jm+-TUv1Z1>p1R٥ڣu7Q&q3KXAkDIvRֆ 6wD\ϗbWL \sqoO)?FW|t5t bxw[/!6> i(Ipy9 J"ΈwXi .:pN:t |Yl`WMMeYJ+sS8fz9gHkR10#,gup>Mwd:}^k/<-Ϸ;ٽv@6 cb /hU(#ak(?$ۏ #Ssi8֍oU9yfc+\1RRMF @1֬,#Z"[#@JE :*U(ՖevCY\x>z v+0}M`NxždI';u09H[_\9KK.xvmw4bJ'}*m*ub31B^l"*{.`23Ben7k[=`` !6^anVfgӳ~}Q 8Ǟ=0$XT`ĽjIZ˚;Ǝ࢒2wz5Oq`fq?̷% R!Qa:*Tp}Z~d;xbME/*HN&t>{I= ~Jȑ'ZE(bk?EQzf;})N*-g>mfŷ 'Z[못Lnمj7ٱo5_MrY<*l_b|إ9$ܝ˓˻CvC)윯Α,ԋP7)o~rKwh#ѠFFeۤ>c tmYٶA+7W4?"[8m[`Zw6L䜳[vmúѴ^;ghZF zuڭ.)>nV^m lxnx5xf,Dca;ӞV YU|4L :ֆIG\I>DѽGgi%GHٽ\\tt}1tͰdp~$/u$*Tg+<RUP18>H"8 c磂#Q 52u>]Z~QS]Q28G9A8VF8aAwCq,A)tS>8(} ; GtHH"G-9$_H~JTW>{~w  kul:-c˜t ) 2Sh۱G-(Ix87;|&reђ"ؼLHQi"`JrZnv:n z-S=`lolՓ߻47lCAD~w82Faharέ5p,s08\8 ZZXcfK>N-,.snCd8hNZԟ^EJO6(D=# cX|4=5]8u˙{?ÜՍ2up0] IDr }32ä+mk{MkEEqs_,xh c8h Eq )Klٔ(يâp}Μe3}s=v"m-Qhe۝voͅfbdd?ӈS1S|ޤhf\t4 86j0)m$+#FzصGѳ$&;PU18!˾WF0XF oU5Xs+G[[23 oaLQzAN{,,I^Z ؼ*U9)c"> AjǭQ:>7ѭ+v¤~/L? 
ҀLg:L-W*_ ۨPp_+P}J IK޴{XKۑF呂),1֭^_zݮ#JXk$F& kURRkDҲEV?`vXDZEuLT>u+x/x[ !QD4Kn5mVA|"U@r!Cml&4=gGMn6hc.U_P.`av?B\#^VWHW޲htR E_.'rn{PHv} {%,bg$% Ǖ*K@ewVY%5&DąI>D>[a%ѕ[ĭ+#F$9Mp*ؾjO+ sKsF4ea^Ůdu˅'h]Xe9g^410E15iyv*i\ӱ ?FieCs3V&h|&W]+ {MV7ɺ9ځW'Qo/dȜ:dϾN5?Kam:'ͯEӼz٫CyzYwDtts36TP(}}@cԷ糗/ZN{O]Reox8cR|lBXThuٸF(Y(Sk޿ҕtt>olNM~(\&48c>QZ>mLE8 ݻ+(BI_o  :~tj|q5,1ׯ,)Bc'5ZŽz7E35c8)w}hΈG38jK_%,?<}+, CgUejhE7J=II0@H*Bt@+ #L"9K.?s'ivST!8xad:5E~뜈ԵF<;j2,NC!:Bn+~+՜!8ZIUG#ţ%_s#F!zϯG7t7|.($>db ( O"|LrT x 4y?+Z0=Kj ?n)_O'B|a(<(W"BO** {Nw/3JJׂJ˽Ji6QEgc9i&*mҕQM>싶+R+5(b$l,Q%Ij)v{K1”p]ɕ3%Knָ%+A0n|Ɨl|M|I>%_%%E%c8t@{=$t(ĤrK3%27"7I^NMwXU"~x.~_S{S,#D`"q).+J0\O,wKlci g\ƛK5*U>R4TY&i=FbFhFVG#7$g W0eE [ŚB),V}:F0Fh6xfIT(5i|JPqpK6dKnK/4dK6Z%t/"q3D !h!0O̗TPfDU_2/YE^p#q:е"5@_%7%/Ԭm|Ɨ\kEH•Ą% M, L,XDG#+o%+elN`,*xyIw<]*UX6\C+?`w&d#fȤ s}͠>5an6a}an#`E1xd#06dI, r jIrca|bqϛȊuiģm^qGAX* 72);XZ| 8 )Z^z IN@Mas̞'!I1{R1;}̦Qy`'!WtJ;ƅw/=!oo…ny$F2~zgj1@fs$EM￲ Gs]Kj|cnWu*ʻ={bl31 |-4!tC y0+A\u: sNsH8U(vړŘ*(/k%B#F&xbc[ÜĹZ 7gd+v%,?~ydLsvߗvq.qvMV $ƨ8!ŞW܉A;H!=s[;e0R$?:H)mB'4J$A6tIjٴ@pC۳ (O"ezCH9rB-bU a{3cp,yL0I%  Ob!ZY{!(7`I%"g"a%B#1Xz<'DASslK̴N!`r-Lo7_G2_Ź3K`DnS[ 4^`QTY<,g|tV6'h.[0@QbcFF[h~L_Ez*VT GKyaz6a Ź79szIE4ңH[#jb-`_{o8SMąǎ9 JB8bE`C=xƃ>:A'JwvK}|نVe65dHp,gL⓰^O \sMݾӻ[y?юg޾|u#Rҳ;k?rpM1]Z l_G0I봁U}f4p(۔/`͟n I$9$`Uw^x;)QZ =͖AL%qG8JcʼnL$.AԂ1-X$|c%T8K&B nr-p*&!s(\B5k$>NC b܀FZzo +%쯀al Il0JAK-@eI H$Lh/r-Є ubLnJf;Y̲H7ۏlHu%˄pIW=r\yAa VvΜgg_Dt)(~cKOKUk[jR҈-Z%TET[TZh3/L zIUN0R: U nlRTIm`jK[nuZ*FKc*uK`PiB[鐖^XlcD$b̒8D>bZ \ *q> 7^p5,yOnpvm{{-|aO߭|/eL-uJJ91㫦?{bT"JH-7⩊ayM1v|^^,R=? 
YhPEJ{(CNn0_@Ԟ~3NaօRo+b^y4jqe]AQ+MK,GcBD%»ǖxǙD&pe` % %Ċ& 1)-v;kQTi@l-UU{^Pk:otU&鰯a_sE|{k$k[ 2,Kz5,q */ΞgpG KLEqC{`H"k>[?>|\Z\\qKsn?TrER[[:ЂZi6by5h߻۝mtAlg.Jhx`>ơ+ϐ+}+g ,Bd@}Ns :.b1bl$ 1 6ъo\sPٮݜ}ڀ[MqI&ᇢ|umO%m^)={뻼82\Έ:$ìgj8_aKRмϴ+[I]圻:S\yx&EtrR@rٝHUVI`wUta,+ڛ<&FABܬu+?šcnqtm;nzq?>߳7?[=gY-ڮ>n-gz$05:{1hvnm)=.FOͰ1=zv.S|R) *$V"g+ﮢkx%UtyEs r/8FJczVѵ㙆Ha[{PlJzp$n<~_*:Xzܢch 6K57Z{"xMm%ǠDNW釾Si: M;D#?5yBD6L1\E-g[|cIO c3d%"#[8,:9Jco?A+kmZ ewzLb_ژ?-xT2Y*xA\W,K~]Y]څa۝V#Io\D;22 BeK@sgkaC%yßvkႫ.Ej/MHnx%pޒߵ컲9HۂQ;tmB];n1Uq2[m?_\rg{q$RNFr_q'7snw;Do? m:R" !u3зeܸ3bccRWe5CPV4HY,wY}Ew޴ ཏ^ 1l S(/|I[=;wOcr< W{^#fdNJ3NPˇ$BMIIJF6&#F*r(O=[~<DΕX/ӏyks6d !4(0DJ3+ֿBKC}7^LeD]jK296֟Ϯn?.g=ly>nyޮ_jYw,^R.F^|S/:ZxM3RIKg/?_vCO$cvT C&_>&czv^-iq1wv35vrFu-B+vm܄-[}sxUy3}lϕOEeWBY"N.8OsxFۨ q)|b`<+E%ۦheA(VJ2cCa0@}`8(0|S J[* *[w/7\z>oHv1`|v)numRMczM@op Wep9DB-Sc׫+=mbʜ0>d?.3{wn0#WzD."gh5m''CWHm𾉞FxO)5:PR JE/.tNy jB \Fŵ%.2f@Ք GRk1ϗJ*oV-6`O}&\eDz>4*5KzpbZ:;}yΙ pqEݡĥj5\.;>h]$)C< 'ޒ=Hq 4| N0TƐ^]$6LIYњ=[v؎{, U$%0 H`4l2l]e'n|^Zĸ/9xԍfexʖnK+gu{9i K ;1ʁ24vB-g/Ϛۿ A(.ۣ  |-lXiAR\V2"4e{ņPX6`$}z4J-*xZrm4:bvDQNKbd,Np?ˉI3Iӓd8l0D%_1re &nvf-d]!x)T3`GwNf}Ygi~ߖ6_Τ \P+DdGP"ITQΒr;j_TW5gsmf^+I3)"#^_:TDY#GZ}wapn(lJ|co?F{qvU5(62 mx_W *a&`BLpK m'&,$zOhk+I} @r!jT2Mݑ㙂+I9++3&TM>!  KaLӳ5`_zJ,f;D%hʼnʆoc 9Ҙ~j̆^ .mX?E_+>zOY7KO QLMP}~lCLݮE=HIs,iv35!7E CO@SgLC00R](-X {O}6q;ѾeΧM79"mնAhVs bpI2HIEKha˭ʨW]V QGHfpxCF:wl/7A)'!w*XD}"FP#{ZsHTDtԖƘ$?ltD"ʹF_;ZyUT-#^]-#`%*-`1jC2@r='*TH2TUdu$)ƹ.Ic)s `@r(m2$ѢQmD!].c*jŝ# (hTn* :S#)T]lDQ %EQY8i#GӾ.%0uJ!x$d*ſ@,8EZYUZPI>GN7k!JjW/s5|a1fMTbGK3-!sSQn8buXҠ̖V|BAl,)r+yP$܇D B-?ЇAdnFFL ҙ嶰_b=w|zmT3kjr58aux(hwʩԄrjS.f-;_ ! w_+7ZcEDŽJD׸;=ZKQAM|B6TZDmx[N-)|{of1F FTܛq P_O-sexK0s1VPGo 5˷t$iS9̲RזּF/koVNޗc¿{zk 1( (e Ղ3HV̲O"J;=(njc*cVj3CG!` dL0*i:H#Paσ$y#k2c.ޑRQsգ޸t e}UP@8j\ (j E@7:PQ4UfDV[VHa;GQQ佞ďjL5r(rߗ$K0YNCH&2.S+%8^3ݕ64yjeolr Drx $H emc&)VAaZC ] ]qql^ww^0SX]xתY[JPEƻkZK1AJ/֡¯D_Diq3̜AKW'w {\hpjp K}3 ^k))ID@| J=6R{10 h%XmKk #f9XLRKWkT`$! 
TGZ {j@{b4Z#Q=2MՐGs`t@`U=ؗ `:eF#tSM: DjO}ѻmzDwlhIofU)v{s/&`%HG{xO/'0l T1eJѠ\!z,=&rm|CeHS3j_lt:EʱR \dexJ(^U4JW}Jn<Q1x//SޟwoА'Q:%q{7UT9wAĎQǻ ޼[4ޭ y*X(Ht|/B`/K 3J xXg3H.̥'JAJ4$|= E;g字06$NHȰN` #9K4lb(bO0v{)5CY.:e4SR PS\QMUDC2{FcYU1 0_ C\~+ v]w:&h0mQb!jLaZ[dlӌ9i|4(/U SA {444$hZfS+9y~chM4ƥdwG m\ #BZ לLi)孩CVCYz,`m5nÛl/P2F1`g&!N:cFz),%# .\Pj U-,pG2X6JQ.]|Vҝ[3~_J[񰒊=l}1!fQL=pKߜ*ωB~OiՎB!,c9g<Ƃ(Ʌ|TSYc h1{\$^Q4]As`[A9,yt Kނ}Ou(yG;9[Rܘ7[ L#mq S[j,r?Um4f@mwĬ *m4"}8],OJȆ5VuaąP}3~кr ({AcUs~t4?$.F9j>,M"JZ/bl mC"NLڷ,fEC4zu:5ͷϴ}}^\biUjiE eW߷osYNEoyp=58)|W!&VX5@Y㕘4n/W[)hN 1{L2X1tG-ig,ehParɱJCTK=`SX@,KOXzA6+kieU91݃<;1~C<e)O *Y 7(p&U|xnA+1}40> ^A>b)>(',(ݏr4!W\OcG/K0:p!ohxB]n@%B6xJLez>qa:k? W!Ol >,l o3ԜM tPKA1Qge  .7yj :6 ڒԲ $"vOOͻB&/П/&}$vuiwr|@|V˪w[aL0ϝ"%$#b  !s,9 V؁T=vNWzh;{֋uV zm"v *7Z^O82MJM*1HMaKZ,,tmjx|{QG ^Ǯ!q<ueokPF0bӳޝes=3Cp~;A"solںd7c+xMݧ O~~uQ/_]#HM6l@=:3Pf6>Iſ<;5)rC/c[o!Z9ZoyƊwJ Y\w;P6\IWӮTA'8 &!FSb쎍ͣ5W0!`q $CiM؁om~Bw7f|0ӊH+S8G% řūʐ\p Di՘\9'wǓ]^]xvvaz:Yi1RՆ\ӽKzI+JS*7Wqe$o\slmdf4WC`u|9)a-0kx\jZ>1bJ`QD,:L0v ք0Grc#[Ja@:rWuy׍妓P3kcؿ`L;Pd'x>>;+a-ֳ"8 "evcNpRPkYE0̨BZ6zlSn6!m,wZ8y)ޑgJƍ_aCH܇*Λ-+w^o>[.mu(yL8f( ęwDi4 hf\%2:_ Kf-e5+Cdlޚ >dܙv=v/}8IL8}TpO>sxkCpB$) |\4SԲx}l%Y"l5G8?L/$!w=Y `׶6>ÓQO1{fb%祤Ƙlӂ$c>oNf@HBIյqq8:ݒ\zUN';/@kG.A2UP7nZYS`/ v;A1t-н[ E4H w6In&hP#:cTnsi`BnuH#уe S$|#i ݑ$vYO7.SxD渕0TCz(^cM qu}^ZpsvǑ|M;T>GYdL$lCR< :$>3x^8~PB؆j )A(\sR 'oET0%;PY%-&~yT7'9͎M}û󞣵7ϋ3BWx1v".#uLkj4¢ Agdc&=YH˚IP i&s0PUCNml aS>@]L^86le8klA3(-Zȩ_jHyIJt]wNss;Kr0 -kN^5*n9Z8h֛ObGbw)d8bxRH&Gu* ^oeU{g炙{-{-X}݅[aw^Q/R4"z4ҏ"Sfڨah t}T=lnnuH0<`CBQ|zI&=Z05 9wЙ~F뜻loy^^=uMO 'cÁ_b&+ l0*DR HX!1.D$R 1Z# ,BYG$if5Z@h!? 
Tf ~p>'Dy{C`n)}T2i9NԜ4D.@{L~-(\^ $1J2ۓEmsѝWG]~~]~jWkioombgv?,/'y擽_Nb+粥/v:^pRN$={J݇.C}:qj<ߝ P&&VBW W)K?+DY_&8sM@hp"j)8N 0NrwMׁҟT;KIVճhx_wE3cS?ts5@ pdQ R3zgx{HN(/USy;3԰|ygg w~̖[ߡ:5ko:VX uΛ]k$H<$exbi¯nYDh"҉C/_ rkͰ,V$Hb' ƠmݲX ӡ3?T }[4t좡*PZ8ucFㄉnL wF3KC֙\%ȸe!=&IFRMyYH l ].WcG+w(ݽVvUp0C6*:Ojvcs-aƈhujt*'NX%HKBgn^I|.Mǟe}y?}rU.܋9q\cYݥ !3Oo%عBWu6|JA~x/ٝf5_NH~xs(|aW%rYB_ԙXR͍@mKPX$c3=+$1`43XkP7ǠSg0 dږL}@DĐe%aS%/j7yeC7țM%@hmq@b@QR3xTTMvWiU7M|5ӯst1uCpT6m }YzrK[Rg/nW4ޝ[L:^Y>껩92fUC/zT7Eŧxt?D3>0½33Jy0[[hF> TR|{I;c<1YJ&~h(KAi7rF@Qefaݩn;-NவAkc$8{Da,|°MyY*bzgB9 g5 9A&YcgFh>ktDŀ?ihVo/"1˶ue)5$INͫ[@A4Kx޻r4Y?* UMyIw q@3Rh& vBV̺Uq2%Z_ Vnd8,P1ʂ#ZwNTq0l|I?^ bO68׊ 彮3$uBZBYb%a,sU HdbD!gUWy0cō 1rab-h)F5* qJ%VZ _=_ '4-dָն9=8aE%Qq{vh@Gw =μ{PX@/CobG ]}h2~~QگF!1/) ], Rg8|X2upۇgM)eYs5Ǥĩk(-N :Ji%IDZ  O4DKt&ڼ؜5DnH1* Ɠd;96J bgSwjif2ŹM]OC}WuC0 py1ց8_:FvhTs[vCBȔI-9v >:FvF?SR ɴ[0j:$䑋hLO.v8*4:v >:Fvxi`BnuH# *v5B` v;caZ8v &t_VFj>3԰QݿS;l`ӾΏr;Tc$sğTNs2(gΰ`mm xOfEZv>vfbwQWuc iPi UXU2/V"@ =[&IdW_B(웏;xgO7]@jDu2K]9&4qr SP?.e fpvnRw4D!]D:2id;'t9+p2mG5K?/g#[j)!P!s:#qLVVk+#s @[720I40JX,)f051eBb-B 8v CrǀI$7ԖGm0 vx.ofe XltRT"i,8bc$VJ +DRJ欰 A@ajpr/JDQ8V3x!A5M*򍊻A8N n#j#g 왱N}TNxP|uCװuolp bc)JuLE[0mwo{ƍ,En\K2Iv f1ٗ l9m HJ*QU%w2mb{:u.Dϋ]dn3s dsq,T % Y]_T kWy=cfص >y1ce3HG OtN/*:Ri )08%0N8LU&VK]ޏ v+~N%&IrEi%gQp&,ё5ծ¹``ʘaz8Q,ޏwRFg+w)gf79/aZ9>ț<כInDL)>/n3!X2I i"d2]\XðH+iL6 N΀b$OsJDb"Yuq5Ha~ `$o|,)',z%cj TctJR1$ lJ)p I>6(c3ÑE& 1L "E:QLIrST ChKk m#2f, qjwi'k F dC:3@!Z+gsYqsùe껐hѭUp@qSrXG&* wUCvPd!Lnboli=ϊؒ63+=p\Gwv2M[x:;![':cߥ Zu-bdoWbڦOښ4Ra>ew@Rl&>x)ij.䞬N~(KC:)[qWwpϚ!̫mro&?:SZS~E%-} `J9\ZVc7i)%NLK_ugP"㾈}nV9316f,!,I`pq ;ې3~2ۡ5|:]XTݱrr,&D8L 89 !ͮ7̼8iۊVxSk jij)Ͳ3R)<ψ\15uJcI45}6KfORTZo0aձoqo; ae7"4cg%Ah0}ltg(5c/\9/ݞ5O9&}^?o-mM/0pv3gw$jR e:9ﯞnx}>Jp#aboVbmq55 ?2.TnG7Ә2rv0L7Cd#m-+[bRȊR uS,R eQ~$AԺunHfEB0 wc˔AlYɖ:;T!ZlYy@f &B&[3Y+P5k݉l֎m,z[J\"_m)"_m쭇~"c(eVIiqb-ISL_X ,QOƩvag4QA1N!&ȾyYzB?e-58vcOMu䧒j%76e||Z+HK)&b;%|4=x`bȣdNs3lGmlCfDՉW?mdh&[+H 20j|.ufV.mP#6!{c)¶ "`!|z =j0wd9(W9Bh=Zm&Zx [_U9)ntyu}x^I 
x۞珀\R؏&P5<[_JN'Ag)?c(HIh?l;(u5&Ifb''dL0"8KΚi&5)l-'r:Ecs# ٸ{̬!ɖJlDdu7{{rFmWC^[CL[ze.t)Ľ gA%X ŽqE2/ܝiM3) N8?ǫO$~H]-q;Et)G{uA}1O?l{lSl.( L"ede)Ukdkɒzav_{&R&,sK#$ 1QT!ȹ4 RCw׆XS21]kfi(L9c7cV8J?ǼFI9E1Fp*ئM\vh0 [3Js8 k5X2E Z=x2judhd^`m8sE"9\ wKf ^sZrWt џ'{$o/_|~Ly{P4 DKp~zLU\\/_Uy=tC,OE*G#$vRf'xst G<=1:#wKƭ-sK vHNlCի.YV.(E+,Wv|m>W?I{nE@"(W~t }9v+/1}ؙv0O :-u9C(=-Q7?=:gx[ QTa6KP=^cvU805×醻]/N7nhtNsՅ{z;b6 @s26t07䂴Kt_'%,V ~ٺ^Kϯug𾨃-/wAhlN˽Icio\5old!HѲ[@9zлc`f^D<5+7YCL *,JNV/zsԔ~MR-Xaf0Y. `cLdT47I+#PGmx$aZR4j`NU o1fo}?bVc'i2'RgWo*/ -`->F E}yVGì֗\Rz]l!{.ݶBVX[^FOMqvqFZкBq$:cT+"a$w.=d v "S\B,N%v#E!C(Η0XJQaDk4b$D]VzP:r1<\S^2ܝ;(TuAE0t{xJ"݆\;ĉy-T}#{aBW*vqK,I/qB -\! s H`O6[#UhxlJ({ph"`e+V[EpI)Mz{[̑!iKk$~h3( 7Z7FoTX.>TR: d1lS-CsPzDp kB.UN${4|7;* En ['QKAZ@>X!Fiv<ye?8hwjVxUS ;) `9P eW/!s o z j1yhv遌T8y1;s1ɐbcl^ * @.{\tmdQ^HYkZYTTҽѡ~W-GM}PsKwY4큎FAt+Dde.G[t DPa7.A  fB sezsw$d*P`{@|ol2*808ڰK8 !_v-pCn},;+q8=uUòpv=fHsqxof^U˪x33ߦ=pQHDC!Ii+xG-Ժ62ElE[ s60';#FF* zʰ9ُ%Iz`Js3):4nV66 sPfk*|Yj+/K 1*b mD8GB}X/xMSMLwdDo7S8&2 Vcg 21Gf~v4gΈ?2fc w2>7>7T(ZmSIC &xyȁ\1j*KERa՘d|T@QCk;d"2(IRf$d*4)1 ax2NLQ-wx&njoBJDfQ4%o-`Na\tj1=R8-95Cs%m|!;Rv|Z ޘRV:5I}$=6'%%T7lq暨yy=gBkɨ[ $b\(hu7eƻdݕkcjy~);ķQ!&BaL%tz#NHz`OX-l솕XJaO]#$1 p8(I.1B`MP .$}?Ӛc#$ Wyō9J!*q^f+i~*ymz|9SˎC:rZ&T.(-i?oV|,}?U^lϒ1JYnK1n1vY8]Ij ma| M'#h6V(Co #]`e YSi/H~}pTj9NZ,L{ '*펰)6֝z1Aq/MwiPTQq ^Whz="~q\l4S;s/9)Fcg!B+`C9nG3]\.C*;MJ,yڪCkG")@ޓ67n+WTR/pS7dgj|٤\$I,R (Oht7۫zΩ9L"^=PY>jD QٛjT&r/UFÕgJ Zrp7T#mLLi -p Qvu8~dc{xH1!o7\& -nNʗоY{#V|[!zx~e+7Ib<H 0j`Tu!(PSy g岭U@xP|0kgAﶳlP\CNdo)l(| &D0,N*snKZeTp[ATJ;"롘#Q*(-{9TvLs Kp{GGE[ {I`+%jAbZ vw;_v@,8Ӏ!o`|t, ({5oaTN?+ӻ3ЦrX`oU(;c$.$ %vm%wƜUl3 =#C~M..v_l^^CmV)iyB">˾u,zV~}v:|oa]FFѫf_3(Ed85nj&QBR4#GL4\Bpx1uHjfWF } MjBl@,₳F""B(c*bX1é{3Nt ɓLR a9lN1xRIO' lI +,Բqj+ ņciŊ( K:mO&M[ U4z*֊*r{YUwC=C(q= ^FȀD`0E`lGX? 
̑^.&,$Sm_癝}.F~/+0%gw.csIhCPEt7W,nbnog0_o[2Å;7vok|;ͧFQM$0/J:H6"hr1Bu\ˠ N\-o<1YmGȱ,0Zjm_1>T8V dwHaRaBǞ Rɤd$y0) qP朵2%kO#i\94$ CJ[0t&9_&ZY"{JDong %ӟaҺwm:*wRSGr[7(6)wZi7'Rj99x*B22C:Jb%Ipq* ǟIX@+"mf6"PB'E&q$SJ]dr!jP9c=TdT VQw'H>X.-=9XD:TD9q!F/"!=a!Q 5\{J7`bT_5d璋vwB A+`{L#bʛ?/M#RZcZ%5:Ƅ4a)\R [@P6\mL1O>Ap¤Bxs9Ac1 ԙ(pHnDK9z^ȠNawRk^s5Q#T9,E?3jji8L 0 52Ztm9g q U *j5Ҝ;W+-Mh={w=Lji.Ko-cWGnml/_hE~w'm >g-p9ߙ}!X)Le sVI6+Q,i9'n#n[{ - /mXY9ڒi]r)D*'qDjQj34 'V : &72Uer46cFsq"[ cNL%KC6\(S=_NZm0ܧMR]ԅ#-Dq,ce#<ʰSj :2T+`#< 1pIʘ*% ^FqD[+9`9{$9wJ|Xc`g(I%G SqqF^>}>-.s~|+v.:tW7Ͽcz.3.6~w%3ŪԜgt?- ;KB҆o] RqN%WFk;ur3^j (f{ߥ׫n ǧlخJ1=8"wIЏ.ջ RHHx 4H%PSt;m68+瀖;,.v'k!5U[Y5OIrȬӔǮӂm͚K0>{V$AsAؚ߿~]B`8o}76{U2S (Yx\?˳7knW0oO5_?<Rދxˁ4n?qvfɻh]6Iuu`+rQ0;ѕ_lP]*h/(ת5x4ƼcK,׊mq8ltukk.`n]mbb S8G v"fO9[%8>%(+gzX_!82,$:g%F_entI{O5)Q#Cp.0,K|_uuUuW(B.D>˝Jk]j|j}R3]Pci1H-J6;Z,ʁg&[gr%`{dl6{pߥ %PS&Jl:J<>AH>[ͯ{ QQm(9x/?]vss>/ze$n'.kg,蛷g.9FeWD~٨s.wPև>k\oPdW h $qUرlC9?,t0J4 H,W"y͞7"h PS88.bOTr:IJs @lT}G+cm9u#9Ո᜹:ȃ=pNS]*]]/{st]s9;X:tYf~I>ذW#~]g&=K*Œn xd"1P ?w94؃Ћ?;T±lya?r[>hPۢ9v~~3?J!CZ?ra!t]gKZ**:Ϯ5Bt yV[* >Si/_/_]CdHpf)ۥI ²1#2_>Cn]d||y7B-/hEk6g]i1ѫ祼A-UWsh?V‹B~wWSCTI*E^g=z8MD _a./s)78eY&"y][ x2&u:ٻu`l6#H>i$uq"aD+2ZDTm8 QWXbaBBwVxyb{6}PZ 9dWs[GU#y~1Tn+MO"*04Л:t^s@Y(#L@hdb?u_ _Ҩ!YC,0f~/DkϫwAu]K!W {yZwB5P~v2@RD<)jrb $It6 ʁ7}(h ?Yw8q94C.]gZX+X7Y6%=h5BZpļ3`m¤@^Ǖ4KRhU)mXpIMec dh-[Ck7^@dek{c/]_'jW+ۍsN~tSwdL-뜳󫩏o zNSKκ}Y/9u{Mo 4:js'1p蘢] )I ̂Mt*)qd>#|74.f \2imnyiLY\ry\> g?,Ϝ,CM_'G0'R >6R&C5rum OytxV/DoEw9 b"@BS xpZ<:֔sQ%mpCaq]z# \Xhv5Y&VqM%}V k*@7_\?ozbdB_w*l}ӣyZoF{[$P فU4Jq`ܑ~$5%ٲ+;vJ(5->Xh!;WJd4Nr]Gn@Mȶ:Pf]UMUJF?j"@0w' F:o,E})-p^6Y̦k+j2Y_зȰQ &FpjB+ɸ" һo"PG|0WɛdI ݹꍵ?ǧf}8/zt#1~-}ZLuN% mH^?2`j'33=%˷yPтF#$pZˣ OR)ѓWߗOHٴ}$Z&V*&y1`F::רt(U Eg%@QL`ZXf# 7zcc,8jdWEmakJ3J9Mn2O8|N.VO7C?g6YӪf+W _H]V&hd?Ѝ2Al14 ep8d W]GVћLJ~Si3Wʙj 0z(ph°uhl~~==/o.Cf`+]ؾt C44<yҞ>0[r>Õ^/`89tH|~˛xxwPS1.qP:k@ʽs ^b{n+*:>~L ;E)vpLaǡt&cưK[Qtcɕu!{$s>\]&|k#4ܺD.E}ݧږe/[&6r*B3/\]ɯ6jC]M}kYAاB<|V - Ug7sRI'C[ܹq6Mz~OUdd}8+|.D0"`%IB^?{|6JK{ S5 /xwMT j1R7rI/ lA2>T6 
zvg_vQE].ەut/CySRTd8H BSDi *9Ýޅ0NrYx;ԚD+&(Zd-HE O~Y/דVI 6k:66^ wfvaMЋו7E~U mH*̎  d Y .$$eowj`Z,dӠУw>~ׇR?xFovd_}]:׏2h^aYɹQyE,3K.xeZd*.Z]N3PJH1΄lyo3p7B 6qɨ`m[q16e_ pisFByo)x`\2cC6gI/b4l- 1V)E''_Edt%Xtt֐ܐ+Xs(HEܡ皴PҋLTm[_+ [ :a|yRK`͔`(")%x*`kVnBm`[R3Pq}`Guv׳ }T; gH2JghȌGQ%mwQ-ʽ 63+"a1"܄!6>2L$͙ސP@zƒ <`_o&5ą9 %MVS 7[xfv㚇d+zU8<iJmȓQ3EN+OFJ-rB"#ᘖ'3s4 #c+ϋ䒰mN:X\oO{ 4bYD>H#4L364R rv"I}fE4d"j AoݹP=AI GKqYp3$2XrraKB%0FBA!NfV8i$CtwAM1+Km ޓylaM/]vtc[0Aj>z}\o$#iJyg7Efl$GEc@1!Hɨ|*c:*.zFR s#mIjuǯPESl_q ɮ,y)Z8?*C6yVģ.2:PMU~͕83:J0NuPkT[ ]8\ Q Jo Еpb V{&-̵Hs0-њ>b+a87 1Jva66EWڱcoj5{\C՘=(n3m$4CXzWOwNz1i^\Q iU%lo89LU_2[V>zVdBԀ[lq']\ݡ mu5MikdVQ`k څ{p{#6{ׂ@Hq]ލvvQΟRs x^0'&r)`XVy98qNp>Q_e)q __~8oӇɛkή7Wn5o@^|rc+Coitwwmw{ۋipY<]\͞KjuWt"!W:qؐ38VI=lLMF'e/ja^N۶}DZﬦ7_jQ)xoxG!7 eG).s}1)>-[1)>Û HV| tWXd\Xd\/FSEj0JDrNISe  3nQ;Єm:, o@6R3V͐b5^ SfZ蝝wsC,t\m߾5LPZ/ p@O-]!ȉsQ1EѠx4INuQbt*qdimDΪާw@΄` B(yK_1H+(Ԅ^JX9 ꄱ((%5\T3R`&Vݚ )䊀9wvɶ:wՓXiIC(L#HInz 17\'K;xt.[ɅzS/+ZhX͛[I0fSG=iF bEW"FB㚇F/DXu:qF6T[y~?W>*Khfh?I,6@jU߻,n4X]1HYq>h@fR:Q`,w,!}i_v4oٲ[wAsݻ)|C֜SB7\Bs1ЍyKrrSbsF&@;PѵiyfYLOҳl;/AYQJydng%-I,0;[+T,+N*O I tuy6M_|Gh2-#%2 boSt[="b([ukWoB# )|'_!/l#ЧDdNRmNK!d9R Rpօ3m)π9Y 1tCyJyv悉,Q}kID:7=q&鯉;Ű{$6u_EVlOJ:LE53}>hk\^ipJL>#9QvU6j8nisU_ +ZWr0i4pqq-p >@r8XAԕc7V75)%5NeT/نc9c" m(jE؜õu&3 &˹Č(4SE!?Ѷ-zBT;NGI Ps&5wxtw. 8}\波ӧ;\J'ߊx~A1Ȍ>cm' -zcE\! dz43g<%x_'5!/xJJǿYNh*h!%|44qؙ6⁺a8%ŃH+ڮx3mpv Zp#1VIkƧ؋ Ml+o"0 Κv kS]vA0M)wEJAr}P8uVgˮCE?Ke5%)B.x8xӏe22e032"w l;VLשoeIy3҆Gy }̆#B2D$2 ' ON]j6f C'P#,cH3$4-ߐ lB60L{r~/`-l\[H1|E^䖞հKG>'J%|-j#GK%VD0X@8?zmP,sR236Bt`X|MfM@uݭϣpq?~}]ziF)mBܝfr,b8ِ||};׏.iNy㻗Hc: VuO?u4 Xgdd"ꆛtR}D9[b:&PY"j8ԥL߂D#0J_ 1@4nt_w'%Tb} =ۍardp2aue/پ?pEOiJA˿yMT|'~EtSY s 2B&D}~A]G73+D9//W -,н2jM6~2!"=,~ԡC١C١C١CC>sJI\^5]YoDz+^{_> #NN^rA*1H!) 
7qb3bNWU]]U]]J2e :)E,:MJ2$w~My_&Dݠo:oGa&|op[Y%tOd:0:MG)o26fRٙ܅ʠ{Ng"撢HiYi!qpus1~Wg Pʬ%W.v:l4S&xkؓs "TձiEBn8l&w^},՞'U1};[8eȫ|wן2֤Eg-\tc 럃~lvޛTǏI~܌9+qMop{0X%`mλ7}vfcSx2gaHYT,[ٷكp1rF\%~>837;Qi ݧd\sl+ebֽC>NO7nvG䈠lgN,Ѫ3'ӝ7Cυw`GeΞ#*&gcrPprXMlpUi%*(YUMg|E:+_߆C0ӹiڹ1==W  XdI1Gk #F*$J ̈́Ag'g1x:Nf' x% !{v9#9)I8fR吼7sI[NAh\3]LUbaڭ'Y (~t(}trThϼ__-Ýp_Qz/Hxܻ܃8CO"s' پOfOA-eҠ'sKim!adan|av.xp.x}" >P*/B@LD|3jQD)=/dHb~*WzTbxe[cpmRM>u/-C@SuQTq٥=5ɒqPTB?K7;NV0qa #ETak;cIXo QI0 gqKgHhGSe@T De/8^8ڻOTfMT~Fcorv;:;0Lo'n [*~anˎK}[*7^H8㘖VHBQ)jiEnQ`~5ػr^)ʁY͹+%Æo5_.WA|DLg//@ [1Ga X"`  "0!4#̄(`(eӒs"@ *~Vs`͠#YO&#`r{.%JJC: :24ElfZtGm#|7S.^A~υ-,I!9M(MCxH7VðKsP^؂Ny+/nb55IWkMhqxM܁n:Kd: 7r.m==C\Jr|0Fdu5EWRG`|HG <] smo8#P+;bCՖ܁#w,UOH>2[/keR|`'g8RG:RÌV]'.:`ʂ4\[>rʒ_.0uJ1ywS̻bVcUFq5YE8.L՟dqsHs<"yEc9&5j2zu*A| >PBGn)@8S08,4M*jhzEjdA+_AqH#$ "gc"+W( qVQSo^$RC G&KZ0Y T LSӺo@.5d 6AiLusn8"u Xi:(kalˀM:.kj2`uչgW1[0fja\Fl3*/&w]0\o~׽^g!޾Fj ZBZjŝ~j|y$T>K$j0Y-c2Z Jhj7^X9Ax%5xkڵk+#yܬR%ffqvUr~Nh6f)Ϛ z{`:|λ7}eW~}d{m4kgf2#*rpͳrãgfYW f.NrY,w{mV=r<قE' %/*( kWW-xQq` '=[p]*2FHG,,ZJ1FF0ujbQ~JfF0IFcy Lt( k!HrDpUAGÙ#DISU3AxA>}}VUvn%ώtp?OL?gt>Y't2p?"нkVIIݻ_އInjǽ}y]']SaA`ؠɆlY% Gܠm3 L*:YTCJYt0$WbWРܪ'6?̫Rޥn,+OΌCCHI|E&_4t:a:y}=oWiJWF/QRWw'w]t|;Y~<!S8рb~2>;[2z˜T {5OXb]P *Xњm|؟xN9sud[.kՌx9TVo.BA<ìPҲ2L9Jބ=0 ]vǤV~P}{pvˍ<5R}ࡓLh؄siwڛpVILԉaH-?8->( 5Q= 4K Ť*8 r-c,NI{05o$:kA30b.jWWŚMo?[zN`exhx,|yš!5Ʌ_+(W-G(QgО^"Rs,w-O၃ H@0 D ` DŽe YKܸ9e[].>s xq``ֵԀ#rjMþ2$1?xZ f(ݰkxe RJՙ#wڴԀl_+DHۭ.q\5^ :+Q+.d%lٟLqʎE$5IDnUJM%0f R~0{z%Yv ϝFK?Y]l fA`jAOcֲ"2SQVD& o"\֯ʾԵ^``}N-vn8:j.KW2:%+&D1^Sސ†k+RD'̝@t#PΩLd#=B(KμRrϊ/"^,@8*lsz$Wz+G`L#K-THM5%6]TV`Оsi(H % HX7S!gj{ȍ_17REC C.]$%W[lfd7WL=V셱4=SŪ"Y/Epnҝlr Q]WIϘY+a)9IEHRLJ;+GZJ4Ik ֢aÌ(^ae6YX KJHU3 Oyaɔ4vZb4Ha BsA_|-N@/g`"FE ϼ);.WN>R^duhCAW5˺8C yʨ# TҦjH2 W8{b;AQf@m22=}64lxo+s]_*sT]]q0ke Moj5egGlT[e!+] #OtmcXeT^|Nk?Gӱg2+] =U-_s35<T凭;rvR嫣Id7D#^<(LD7TKvfJc2yE-س"O~}~yo^}^y) ݧOƯ~tmD}u9C$,ꗛ&u{]o^7o=5o|(,^GaFB A&Jgu9Ezj}"x{_P>}v; lv|Zz10]۷ؼƸ ǎN<_՟>(aJI=Խ扛-SqoL~ͫG֋[I/[:([w~n4ӗ|L{zv.11B3ut=˺TTnFXvpD!Kл8>Y8NG7)VȤUڇr8,1X)y53!!;)a~1KQ^`} 
i(T ]s紤XLTRZ@`l,> )H%P8 K2 %b e:I6˜xDh`R< ct"oBz)paR!gcrUb&h$6fye0Ihm9$dN*;R:#X_MxoR)d)yEPS%b 3ĨU}_"c+a[)qѬTAD0һNy:@.S`?(#Q1:pF" /mӜFMlT Y.l&6Ԙ+Gwz:pӦ^ba@f]aW({̔WK0l) Tyky"E@5YkxBѓf5{H0mٲ=֬]L(T;=Gݽ"]rD͔nCj1sAQ8 N!$X=HW{`[~D ]l/sAϿgM"5z/WLNT_:\zO͋l.޼W_s9dL')xҟ r#,f^,AjaWnS751G? sN#dEǽ\@i[Qx 0>hXIΈaӝ@or׳7#-߯xQUGޱ|F(5?1+?h?<"'T|K/eCXֆavhfmi2vJG}SQfv)o7_ +dj=)N h4m0=?*Qӏ=} da!6#A՚0RH5Zw$udX킏mN}KGg J{x{LJxѳdV1aQ]zhdɹxAtA}hK֑ւ 5IEBHQe&僞8гhj/ lD 4jS?^0VlRdV'qhb4D&f8y!0 |TІiI 2* !|j%xSjJcQړ"!ѱjoo1JEBKK7wg XuPśQoƺwsu7omߝ[OټB.:J܃uI^K1"jX#.'ִlWvh<УYZp"Dʛt*P_uPN uX&=n଀b򆽕ʃٰpTS&|A EO^FO9b?'M&EE*^ӮBdÚ x_ /D`KX! ʰtz7iۢP+dXS(P#ۦx.v>mP#AVҧˊqNN*+D`:TVMrc-7J 4Q hrJ[[U,4lȎ#mHJqۏ@r^/UVrI볙j<2 MPm#zS>}/}HLC:J]VTdе 5]QK: 얾 {|[zbɺ/{ " ݞ4; rmv:Jocۥ^( #Ğ${Y `!)dnZ|]/"~ Nr~^^ }\yYo7ugπYVoކ+Ns^}|Y{lzwKՙ2Z-yE:_}__^WWةͻL^(G|l̇&eᓷ5M)#ZSNjxP"Z"ŕm좠0=0j^VOw_"vˍmUW)(h\ [! AKb+~Pi|,a2@3- vZi6m+;挜uXlkp'0lsg2XZ%[#fBwfkzBs-omS#g̾ǎg8Qvw=u5_wiph4oZC5>9 y0BnˋaHm̦{po'xwC.Pcd AP1'T}##;ꁖ%b&v?t(MN@yǢ!oPWDR};v)|W||~uh۫uzL) 􎶦{K'N_:awyXwʽ?Ԡ=*{0Hjty'xGYpvFm{n8.{Eef(ڟd<[TFtJ·糳ˋGiۃCB*CKСեGg[ sP# ,}NN Ɂ҃g?gO~G}4ͪ( dgK جmt A"X'²bS q'J>-htA*3sCtS)f>Z]JtA+DrQ1BD!aȏhatPrZN(l ,ɉZ0Xn: Y{\3GA-qr" 9v)^V%M 65h' ݒ;['Mc]ڈeNr~M|4gZ{#_1e|7y6'Iʇ{., MB >k}'`IQ7趙obsȑ=_˭"q j¬S8#rvM½f SĎ|%cHS T O13B;O l9)jhtWzq07gr52TP\է"0 ^W.+6=/Tf hƘ< #Lƺ@}NVou9.ΙzڏX,U|+}w(K۞=Mۼ@ ,үml'f>7fP#fƋ*LՅh]gʛ՗ޝք%_T0&WH%:;_X$yI!"g91 p  -TEIuWLNԿ)GA0rܯ*֪K 9[z_ {>1~E/Ya|(Ƈa|(ƇX!̣Z0LORC5W A/qz*pLƝ|X_gg%\J}yVG㬌ib̳ŕ=]CD8!q+bAeDKh (s[S`[N*oe{ !8A -؋B+ղВh0fx*$-򇳧 B/[]]dι \%/A7 ҙ'ksgmPSHqP ƳkqU8νG cqL*F*D`P/Xc)ʾ}$X J!jiG9yZAg0#wx:s'`$lxR ty$&FUy.8qW]}׍V~̙kBFs66"z.q+ussXy E= (أQ7 <CG&X7!ef5Ba3+/` [t9ܔWB"ĒD*NH[f4E*IThF׹4%v㎛BL OLi׎{Gz]ad:{1k p8GkB̠tL= HKR-<'¿Kr_ƿݓh^0 ufƠDz?(!_;2=2;f]6oa'%Q_>c%yBݽ/?_Y:d\YE I׵w:]ga\{'וOE{~QQjՓqM)I}w X)xT bL'm1Y)ͻFz>,䍛hgb",EdvYu5Xr*wTep)Go)s3?Rrzzr gAi.g&Ln1ثw\j+8!θ瘙`r"onܻdܜT;Vx(%@K bM5PhYЎ$pA8' Z෌$-kTp-2T\H ,'1 gXbMB3 ҙ0`\F`suZrlF (}|_r}??&83drxQ?o:G}N8r 'HDIB9KA_R[ڬmAPfD2_;ch:lPf,|De"Us"-b )ExX2>_!e;15aC?>4ĔaG%7gxj-Kܺ!9# coLAii *va`‘[r 
!)5ԂIJ#ރT)HL W]\ bDcD#σK:oukV.LbYF Bi:hdžweqs@jQ+K1dqЃY;2?z xFz"TBzy3c~ QU0X $6c0LqVZe` ΛjCx';r\ƾ1whY]ko.F\j!"Ω_@n%hnPEJK$\l>a.% U5ѭyxd;XsJ][uD؍bV.vٞ| K֦?Cqy)ÊF;u =b: gLj0x5-fz*5n lVWӭ6VQ6YSR ġ5iDvOg]5^_%+)i3pSذqJ=SDi1RNv_-?hP88#gԈo8pbH&KF=ri4ROP@MQO[FuV8 q ]^l1-G- 1Eg綜IZ#A̐װk\1OrU#MmzA%/bFEd{Τ5:g/hchӅ%F)gr ¡qyN5X8.U"t,5"pLt hi)kB]aE3.tI֨Y27lHS#yPx MJP}Iɽs9R#"Nہ̆1eG9ϘFHfd]b$Xe5N6lIhFY)cސ ^H ub²JD)c'JPPKLGN 2sA) sK!}fe?Tg5IS1)RM}F:EAuHt#7n)6*)cS[ 'SnĘN3x#γAHGn/7n5Hs*+vE#˘6ht7;EQ@)Jkw$Tc@ h$ny_IN_c0 =~_;tKks?f.5֕7[_>^̸gu.e5W ?r 7 ~iou]H]sFl=\$"$%Eݚ*(%vNޮ独tb~DŽW.N8nSkN]Ng Q7e'Ԟ/; Vʓd0CFROyHc =K}$,SDZxŌCFIC>`<]?DWBLO9lz4Ռu5cGvWX;r9J`@ƥ+r }ЭQ!D)pѕ5T9 wA67n Rвqt`@F fب{з׼BqD =P+&:jp5Lc{E7{+Ҏ,g d+RgP1a9y9g6|*01s.9!EjK9F>!DecSk.bQ;TN!Ge_S1I.L}F#z v}X7b6NMW2R11gnG Nx-}!7n 6"K_k%#J^ !w?0di^)h>~5wrx)y)YDݗ/í}Ν^9wOƮϯ/nb\w_8bK2ډ AnJO Qi5w&UT6EzPE?}XQvMcHyt$i14.g_`:+;ɯsT_+Lor <}"o9%uSZ<+cduV6S2FrJV|¤LlDxXN=yyڵNevR,7FEwyEWU]#ߍ|2bP|.v$]_ MwidMV2B3d5v7mUL2aHS oV@]ȵ5 *x>?QixƋR bXwt̹ U/CRpQh0}5A?EN5 Ka>E@3:5m) Sl6#?2?XnS҄9r\gaw1 &9z4\?BpVaS $ R iͳVJѻ8K& )|5NQEƱ.nwiA(!iV Jв/y6EL$Gt m+(ຒfCwZTZPmO-, &3.,-xF(N-VPw̯콙wcX_^߽;ymoe݄67gl&˚`MO߄ˏsU¤x3o?ho)[Pd&3F9 4ue xkv\csϑ 8 RľԵG<(O[lQjؾ Q<֑N/AZ"8qiZcar)z(!WCn?yTv_gAE}We$PxMćw+[-71b\E>bĘ;4#xy _-\D _>;_vmh"q6GSMo*BN񰬅N0R#bFr߀RDqNu0>,daZy34-CT➸˷ KN. 
})"wmH~Y y-}HNa /;0&i++I>ojI[d|4ccůu#Y}DZ(fH) -crɡGMDJuϱݝ8ce4uټ#m;棾յy ; ܷzTm_aꌔk=]ؕoXl-ʲ.~u͏|<2B2F:kbY&6lBBs%Sƭi7ix/Z EtQEoѝiMnMHw.0mep:LN$T3iPZ+BU6<'~8w9sVE᭣B4`tKmPot.%.@Eԕ0L!fDv0"FUI.:c B53r+,s9U3.6kLJ8z=6T aʖObg5 E|d||/h$yg׭~DS|18* 7 Eځtu~JπO;#;Lצ~~obC pm?CM/]@QεAWK,g7TG!RML' s +] ;>KgZl|m= p&P&٦":Lhs8Rd*G2wV|Ϲvyq&DmCt*Ja*3 29bH g:'u`6J |윹lDOesfMu(Fa#qf:kDG,G $oYJxxzMӍ"@[ *'8 it l q&tz4pUL6*e8&M t+st9\;4'=  -T:G0%AYokB2{ .ݝ2Nj+_y4`z34TyygB3g`*hRjz1uwI3(F bƼvD ̡R{]~*u9øhxΛzwѴV a2˖D?0Ϧ + t8F`T]R@e`{ *5͕#i9xM&T4sT_]|~\5~@ ;㤆 i PMm+$v=]*.@U[ԃ @U[ՃJな֪; hECm,r2aʜr(XHFxe{oy)Cm,N/ρl4͊$ea6#1O*Mw?U/&2fM#Fiz${ C_`8n!?Cvǐ|& ;d=C$1dn[F;&}:R@ı'|$1i"}#ˆK;ڻQr>vmn2F!礽*{I0op{(+kB+nz P=,h63Uֶxh[40欧͈h)~VkS;}ՖY8w -JP'/ z62U8dQM;gf2XЦ=ӾܐL6mxNɓ* F-iC uǶ{kOvбPI?RSMVE Bgi[/ݟlME=:=5*8>8ol E%ѴOfFyWa!,vKۥ5Ϸ wiL)$1[nMELX"&77@T!`9 ]x!Jr)(mrIs;IğOG?3߳d1 ?g9#R;Ar2鱗Z&@eo"Aī5emx]yk nibZX'%{.vku*' huGE9#u&癥Qe 67[bűt*\5zqMh&A8No}%%jql mpV{P, }~ JKb–1m:\N Z^"";5.=P3QuœU,n*]@7[SZ}eHWyoBr;U#i1[lXqN߱e /iAWl چSS:ZH*?J .Ll0ʉU< Tz*9yZP =IQg$`|y4RZ'uA"~hZim'jF`+!$.l0ꬸNWgn, %Ec^uqhMp5K"ǎmpn):Q $2S R28yB1q['YbfjiYEbԘyH|ZgeKIK+E >x0S("D/2H jDdciDD"R |3Q1f(%e2ve@֨:T~*:_R2Q/ Ci'u'78eZ%P 0MRMÞ3iW; 8E4{R;Psh:L%Lj Ʊt&S?\^K݅&dGN&kǷ7_LbsAeƂ̨.JiZnvXRa&ZK,&q/[}/+&Qq5[$|c_nx`BO VJnVJVf)%nMcXxBQ mAmf-G'5!pvx ~q`} , G@c& @MB)smcâڥLLy*ktDc8qq@\f ssSIqi׌^W)cͫ]k]gLLǷ.]&-cR8*)v*)pO&PޛIrcSIZNzm>)@#<L5[~ˬFVA1!@ws*5fM&Rdro&Sdmʉ ZS܊7Rli.:F>Ύ4hJ8ET뜢RDT's)záк(qWuLߊՂ.Rq!R XTQZ)]any)ZW,Đm>UU&qpGx?0t=3Ε}@wy8DWɐkg DBӪZHB.{ZMoJ'J!5w-}&U`x( Z^T",vY袍$^ܡ:u~6//6?5Ml.-*zi_~~ EE{˲EYQ,K v4](,c׫: k =>P {5/% 3LdkVr B6ΚyR4T(1g2W^f˗oڟh -Ū>!ayIayD:M*RB TVûp,I~߇z/B6<ݢ.n0 Q,$茛QΜ3!Si9΋5~zavk TD <, \܋<8Br%DahB_},u` ~z/1>BC=Жa$4TuUuu嵺/C7̕8g1;`$怗Q04N RQ9zHtIlARiB s0L%Jh{ܺ0! 
[3l&3%ud qdQ$ǚ66z=N X=#6DILq0}: Mͺ10ܛHS*g\JZw޷<2X!lb8#GHG)q$v^ciiu[dqfL0l(WkWY4V ď"܎ az|'_3Z窽x~%Ej-)]Q RZ@ UpJr]2*@f8Yhbb<8ZufTHy2DJrm6*=z;{@k1nnpАKP `GI")gaD4O•F,ϭ)J}2fHc.BRDQ@ʂR&e,al0rBÉaBrTV$a^)gzxGw6:sb7_2S{vߴ:-[VNܚ $졟F_W䧾%т{/a[OlrDZ ve ??uջJ]RWUUo1130uhrk/@>*@ !9Icj?z>VXFN]We^>VCl4 q)T_sO7G?u}?~4J]{,hݞ\Z[b(cNGRH" SwFYM&Ωfr~ZíиB jLSt$S01ӫاLWU*tJ0]K0UלK8ETj4V4k^aJS! @EM-P-x>|DS& ~73X`ւ 6;kTµ-7| ɱCJkHL+,m?"+$ CHkOփrf`j{y-1`/xրv4Q)E i,&T .«jJa<lDrK`9o 0(P[Tg],NaP;z(br䫜m?J P {ϣ A!&{0Y(}D-AC7#w]QF ̋u,K|:ƻڅgտ .XOqk)&RcV@(^ IoH:Z7ܐ2d^#։*8Oz9a`l@ jr ;c0TR!v5(I} Y10{L{ 3 RF-Hta%P ~TF"$IZb^k7UF{{Le M5E[-v"@BDqˠ4U֢DJF8efƣ'o̡r aP2~`'vxB+ 1; ЉEI}A1  1i6ԝN3>Ä3Y/9m&'jmZ{cj|!ɵ`Gf)9K8?h*;ď5N)ɰ| ("ź;] ??EdZofS[47ӘaL5Yn=h%^hGFǟ#!C.Hw U]6?1 ڸd)qɝb||Đ G3 r Y&PRo^yt6fY9yA w`ýH7! AWnL3IUw]. JBOQWU뙋ۜ{P#弮{Kg"9 3R+Om(K}nD)'`Rs O)lo4?'uOytiC,z4OqDE$y+qJ~tX$:]}Q\1abp=z^U| A8D}캰r3^~G5 Pww\xH&zN;gލmhBZt:9vVmstKgM|{B Ey/;۵SڲDH3.WBI]3KTj%V!DFVh4VJy+XY vX|ziYZ3L$2ukGVUל]efi[-z8jBZb%ƈ1ލ]a6Nok!A Q6pD>&r  [Jp %D&&^@vLusHS "TMSNq6ZSǯ@]9]OikVdsire 6`@%2Q뀄(b*ذ{ qJ@IknStu][w*T4ϳ.HU[/l/+c.QϙwzI_+zṋ~vo\ਗ(ԤixmDW0cM=e6UV˧R_] 캲{'iUWE5R@V0خ^a.;:I{500qLc+R܉H+'40Er >cF_aEU"!v4fd|)D8fQh]ufN,:THfη,l8Θ*ؚ3%҄)ɑRGZK=0w=0tboα jcdy] uVg =FfHk; &-M;mŃowsꋙ\WOJQY}_-]x!_Ijn7!̿y7gWp}rlPK1.v2UWe9yjmQ'C /<(^ I/w];W] Ɲq-NL\Y5'Gg~.T~MFLZ'^<}wD%#sz=;86:iq*e5sjA<$DPPI?MiXvJy^7.ijhE7ke̮|c+MIK5W2wk JX}GdsrQ/f|arK4TҸ2yt1*!2!Pjʹ: |[m'k]t$Ϟ]T{0.`/ӳ{=pFɪdwh >H. 9 NsJ wJ,yp:/ha- c[s ӥ6@iC)B9c@6~T 56j1R4'x+E!KUF S yk 1Hs5?{v9 >} _Յ֖ӻ Pt֟{_OS Aҭ)Z:Z]O'&h*qDY,qmg08T1E!Q`L~%k&x6_4pv2ߕCŰqst*bhۥO_lxwNFgn}gyxV|?{~*ߞ4g><{qr(][ «$_}Ι@aB[Ph Qhx˨DT-(|3)1(mxN֒!NB{j6_59nd>zoeAԸ9Ǎ" ō#Ǎč(鴇iC 9g=ϓΜ$G_N4G&# N: t){ pe.&=׺2qp|e2FswCF왏~x|z1 'u QDWc!ՅӦԄ Rb7^ηw[U% A;i~@ I(WqY!:zᙈJEhr&&7iIÔ5(|ck[Se{oA?|Df4F ֶ Kpv72$u4<rC,=a=DP8r(Q94@)"͉R`zp1$խo4+Kh} X4c . 
I5_ NօErPBG޹kzg[܄aD;p|tt7g؂T۰IҫFp4{+ N0# EEx |\鍏Ct4^} Fsvi>i!BԴ~ګ{Spx>8}9";AHzuv *::DlFo&hgvH3Jlfd潕F|4z?}b਼7LGbMp%1NCQZ37WTwWo>[f7Uݛ7Y|;gF~q7x&FXR)Kc EpEe xb vFmZlwL d)ÑTqnW(9/V@ tM/% /GOɐ$DNe.grN$$ulXP <+R h %!XIaR^F a,oZ.Ir&Wjo$$)2%Plh ( N!ZZ٨fD:U EwLg5S){ߐxlRab.(P9e%q1:x]P3M0Aqa q1%I!.U\EVfQxDbQlG~m* xdX hGy0B*PpYȼ0}.`6}{G;pcj T84L eaiTcdzQxAƢbTf{ƹ #eԒ,sx@n$fujhRlJ7 i"8QA]LN3=־ "%Fq_om; 6Cq$0\! @=^Tl 9Xw@ F(bKqR֢aKDSC3y*ehVaͨfk = \ZDP؝mFi{jJxͺ}_OP1&9u q GCU:AZhџ-n4~G٩0C[eT%:]T_Op5.Z0f'MUপrpSU9W9Xo SE.Whg*)Zҥ++q `+9?2-чwv}(ךˏr?vT~М@93}WWa5uc:h)>-2C%caj`I*I TxD0TL`)CDsE8RG+FL/5 qu)1?h21R&JK#J: *OJfyuh UWPP )u{0 pM"rj$9,jy'ёHMJT1w92$DU<ۜsj[FU̫tsgga&aym=*[|6_(&( >? PJ5QLOf/y_9gCq߹zwO 6cUqBz<.y&v<x zztyMrɴDsg'?>N>|ZU/ \|&gqjb7[t>c@ f?&ޭkB"D 3u2fh5O?p@_ޠh/}$\b(&m(I)gj!M,]_-p/k_-0Ƀq8!*2PGw-UD\9o~؀;O%LGv2p?;5@6+ꍫ͊ =3N`;Ҳ6fsoǷw8,ɞNh]Z{J."]gp7VwWRu_DDÚV>8:8[1lPEĴ7s*@+" KWs>mꢯJ &@9'M=52gR:z_݁Og0 ٵ(ѼrWovRR' ^!Z,ZFn4~)+7 iʻ cqΎavXG/+lxi:a:cZQeZMO^j!1%tJMYUTy~ӗ6@Z_&B=4}xo3fH9\$LP?;I$Z&- Wb %&vBi\soE9jR)QA#R&AN$Z_ @KDʢFZ݌Z_ۏTn4~~XlN&SnD(AKFـ5bf=5Zjrje{gˇ(5_eڐ{Y{t M!v.Qiܦmn4D%ĿVN5c!lMJpg\:c ?pI/.A²Aߥ)"T#ҩK Az|8Iܪ0 kZ5) KRPI -gDI B?{6e/Ŵzzb3FOz$u,,wf0UlQeY$E š[޺`s6@%TZZK$z8jp UH!պJ2cJRA(Eh * ܠbA@eg(+3LS5ʴ){HTMaayV(Ow?Ȓ:< ڛ*n+eF/V"ՂrTy=lVn;A[{uBޕY2 (#ᵶ6}ܳ *lHd Jd h+8EGwAp*R$Ə',=O^1~R'!Y:nR㋢nI/YUh+Ӭ.9tۅhugf7 U!ܛ[dSX͉C]S5TliW3+C$;Z`ъO&Zhm(NKF2)hSF9iąJVwQ˔wOAS- *Ƌ156Fl _3_n 5BWR2YӘ5WQBk- ŪcT3X(, *kq< Q;3ҳ&yFc"< [bG/VHn.K3!N2|B2ռU0{,S6S˰1̀0Ta4CR1+\;^6p+ +T(( Kd2`d>vA({CX4qj|U/l֌n,_5 녇{&e~Εu|a򧛐u穻:y\qyw Ҹ-1 ΛR(yRŇ0ҷl_Q*9|B7~|"4e\倬8QؔT6ϗu j} \[:2O)̔!ǧJ4TJ+G{uw^{x X{bd" 3thnDTke4)փY<-CJV߄GZ~TCdW4ʐAϧ>ϧ0į %QU"F=xdCF^Pe h/a YyXu~LԪ/Nnwae~{7b6\er|1ٳ+O70I'Mcnp[ee|T# "Gt!jR?C !CZd$! J~AZިҩ`NxˌIA!\{WKBLŖj)"YvR"!.c{Ͱ2$Z8ŖjZ+ a2iiO(`:A72I.qEPN~Il#mN']3ַ\Ju3УrYW)1j?[iV4PKc?DZ$B\n2,`j7ۢh@gr a S1Ybu8m&N8UbeFG<R97>S`sYb0M<80&dS]3n7 C͗Z ^B0W 1 nF:|j(0Թ̠3^4.FyoR !O~J{ B…? 
"{ zoâ^yQŘ`@]2⭳+ϛ64tƓwjjrs^fnr*c\{q"b$ewD]k!k 7Rp6;_|9 sgSi1sɁh ,--g\[2V!]X69tRʞc qLZՈ0ߩ-};i a%΂SH3ը}u7XRn$3\y.,_ eVRHz@Iev*;>4QUFYǢzNU P$ M{?بC+O2i=D5]")pg=لSaMw cIlbPadͩP~9MMЌ] D2|Ek㪲S3wfQ-XgHR;0N@(k}kRf" A5[b3aUפ4 %U<ob V1p$4,98i|r#cD6 Q=q#G0(F #.jJ4BL|])m0\.O[\ .>TPyKu-~92o=g\16>b ݧ40gu^7#vFH0>ùKv&v7jὶgg\_㥰^ EԌ "@U$)n1 lpY 3,-{Aȁr.k=tL^'4i |=Ѥ5rd˅x tZ @fQPx~/zDqh'F<1" S|Bm'U׺ M!I*ﴳ: uurGr5h\l3fmOۉewUg ϡn\"(Byf#nt얋AJ>Iv;fv)[C4'H`XDg7Qyr얋AJ>Iv;ۥS5-]ybv"ByĔߟnTUr1H g4nGs >e|ZEmS@0P'E9%~*-)c7՜)$W+) fGh0x߬)-w.Ƌ=A~.?;B (٢1 qNcC+2koK|q۵pY2/WMq~e>,.ÔMy?JPY@o +F R]1W$jumٳm 0O=+d*gKNkk" -q;&&ㅍm@JEaV#6,ڐ] C;6BKy0V*pYӾhj`֩rx) ?jTD>=zbWEKOqg%U$x ^dՁ%dWa:ħNK?k/_LaKAե$@$VoO^/&aကqwL uǻpذE!Pr~dƁQ)074_>t oMe c0 3,y0k0m Np ]1Wnqb5d}g y_h Q³+ cqEyxaV ^Wh .y]1WH qB1y+z+m(_sbu~_h _v`,(:96;P\ Ĥ> -kn?O=҄fxT$w+~][8*$\VS, Oq3)=>H U$]fLV]19 Jk Ch \f к=p<hja 9V8yt y$Z #"ePx̔Uƣ|UYS7D?Ŭt QڸsX pt1+$nTmkoZ)ދ)7DY!ʲ7w;ooީ>1C_#E Q]0BlWy$ -(g8!n]֎N=h@w͍HV-KhWf6nKtZL$ۢ,RElUO7@7=,K5'#wvKwh˫3 |ٲoʿ׏5o[ulaH$V!ZIpU]wgUckKKwR+~(19jhPpP m-r9b~E ΅ŗD EF\F4TŤEtDI]V2- R8*#lX;p8Nʊz,7/&ǿ|Me"6*ePL_?P_>t73@q&/;$f~@]smOBbUlsz w<_`gp~dfW7Lt_:(tRɾJ"%&=ŇU_Qy%9 6I_f%SKj R{o/Li2zVWt~58 ]=C#I`Qu٤E*!zTZ0wЈ!C>O9!ZewB SIxCF:r>4ḾIߓ Eޅ g7N^fVN)4?LC%FzKh^bJ-H&`-Ƙ;K={?YR=F~>l- p/f9oMl`ۧ0>S ]BK?KLZ”BAP1넮= /^cPńvtqÂ[LKCU0RmlujUٻ XHc~#Y3 S iF-\ιR% EYYD) BWJn#kGDhK vg!fM'%LdTBH)Iu!6LeR]$;2.4 &U(5J0Iu<#Rl\1>ipŐdC{$G lKxORJRxHAE#H$Ll/th3-T5hG:MY)O>\LU!eN)R֯@倹b*X7qm⥕l,['6&t$s(9_wpWyrD΅^,J22IUBBMmWhk6#ύ`c_=n[ſ Nj;@u[!'`4=A||ܢ|ñTbs0ut<;oa-q7kӛqJ !fQd>|܈ywXINYXE 9hatcP.cèd%B*[)R(//tpvWŬ!IsdF$BA?=usi>`38L9,} ?xyq"wD/_u8 8sS4j:Rp$mϟgU..]ȵf FO?.RTx2{ݨ.[:N3tK)%@ dBxmCJ_~YnQBnD WV4 i\x%SQnj.C'd5/4+/'{Z]L;LHp$Ir0#:W}hs1Lc NKdԗ*[68.h rj,.wy뷝99*Q#1R4ݢJ!HN%FRiت ?( {Y2i^ލAFFO j"E6 (\l{QڎǿTxגt_֨e+0XŮD -pɊaȄFa /FΤTnu_.a H+~_YU匼 ,+fJoJ@c\Y,152K;tY8g~|p-\pcMeBH[@YUdZp[΁YiJhwXL[-7O*(,Un cAC+OfeQ8L5 ~˯YRY1VҖWyT óV_˄[EAd)Y:LH}{wjGoZu@T?k= ED )uwvt= ]G{@-@ӄ>"zPp;$1(z.xdi!@Jr7grya9VI?XQIcʞ~i'*Y~ŕ\e{;y줳X Yu!1˃-ܯ Hb*" X4k-@hƼ0\2l#4 V)P&X>6jq@ 
W~sãЂx7@l=yJnLhQ'~ag<̡ցmn63cbxErr?y_6~?`qf&fG4In 8Bp wX$>ީ+ߚZi{|=Hw-g qC9Ibax aBʫ8pkNZ`>e8v\4ކƧu0!؁% >& Lx>UgMNScrO\kw~k_Z)VO!AdCdC 0#Ob,El#LX7[1=_})7Jlk!Ԙ}X_C˦XT K/~7Ký^ʩy ^/:|9b8?ȨI!<qނԁbQ5:̻\Ptl-{ax0]s$5^ZX)q~M/,1&Cm+9^.OM2\f46t͸ %X0VkPb:jlhz i[H8J-GXJѡJ3>w>ނȌغ"=Zc FxRCtE,;'#;=.z;2L N.]:j<֩[',=́e߯!$ULnS[[n3Ej`> T1S JaS6f; U0=XWm~Pgby_DW1EK$E).k 9NҨؠ!R S?=LtRq5*H!;3P5_ZaZE$#[#}'hƏ4aR:. ,(2?Vi-cZm_$zwqN:%̉dˬ]&Q NBl+X;ITfϜ#%!hc3YT*|wJ>KV_R_Џi4[ĀOy4 -SEg)z9 *k9?ne|/ŷ1i/\?q?kROlSڐ.ugu{KGħ<4О.r$VLX xEϗ^; Tm^4MzlsS[ᕍt-$lloлT$&33H(Lz$ z٢/:gfj'7JtkFYNy +dzG(4]8-5rrsL{~י}l"9dD?)aj.4FŇ;U@cN%b(&TlZR=HA9~rC Dkp7B`8AM%1 ?Cl*o:D\ !޹mq=7ȀO.@wQ䨞25%3 ƁI"V|CQ N҆ sp X1{9EԍܠIB@![2`ItrJXYm/&ipK5ۋbe QyfBbwW-}<6$kUt9IQLx1H?rUͳo"]jZ›>NM=8B1WQwgϡܛM׍j4F5!U3{ |9U1BdX,_1GX䟯_j䃖 \u IdIqyX \3]6OATEP#'Lf%?^7vK#ף=%Ƈ1x7 H&AN;wAM葍B!as22qX&m3k f! APVQaQ+?DCyثLjIb2Eǡ|֧h_cˍ$c(+zHveJ-)nS9u 6N&%ޭx3C>{iIiӆ8שLiE}i?ObDHS4{. o]{е$pU`Μ)m&I,dX+@e-}ȫ%v$IzƦFBA q5vjeC)7#6GLy>@qr>QsɉN6\oN: 5j7| t*U`C]mkNfmY UN+Եq$]QP\OvXs)hnDH#y$-X|n|WYq_VR܂CQ߿opq>3&(w|`2\f cfʂ`?~_:Dlr?>\|\|,E sU(uW U9ׄ,E^JˌN:N>7|km#G/w=$dۻ/|$9+d%l[L,_Afs5$Qvj,IUwY 'j"~٤T ~肗 5pUv.K_!2IąECĵ3!P h Sq*4Zওa0THj#w\iжpxmPϸW$BXq}>H!f~:'7=,Aۉ pgN8F7{ y9`~*~/z򽅖tUa|E э;mkh0[û``E{@MhNi{yYJjO&*ش<ӻIjdZzgLš }Hѷ -Ed7ˋ/vz~5)ΗSEr3r6//\~93jL9LJ5LL"0soAgOvWrynS{]%3ډ>3[b^0oUŦs|SL\@bE:q>^mDKcN,rN,r\=FKE$#u/˧~5@|_?< qJ8g跔cwU#WN~}ܱa_Ҙ~}{6ld+ K!o?\dv%f:ڐ! 
1(ď\_reF$lq!{ZHNp/aEbi^\jƙ ņjϸX\ۃV)yUursCNiǕj*:[DRD$pMQY4Pl!&2Kň`R [^!<xq zXJ-f eq*bW~V@W訙GG kKN갴|HEyt|b5Cϴu<T=,dmb](A]14k0bWǑ°iuZ[)!Y*+$gQX-X, O ( V,jujIJ*^:F ]5#\L ty4r#,Q!Ar4Fb!8 m#Zĕ[sk鑐@E{0۾,l2 6#vLi =F/R-t Eя$a/EQF"JI)X~K61)N0t/ fհ,V}.dx(.cb@ZL|uym;9H~JJN C; H5p~0:U&]TWXh98!BR/9@ TTFz!P #JFeqyB4m23l`ҭ;(xXYgKxk?whƕX,!"DE)(|!U+fIH3c!]:ZBцhQ8).-X/._Hu6`14˨؇E탅ŷ7i_c ~ _&.|KVT~Jg>ɋn t!0؞kD(^`kf)`P&;NZ_ׇ2d ]ڏd10+VqG14W9.id-u-f\nyl|gI:%Qۧl6|s 6[GƖ Dmx#+xɛtpPPU* ?HJJ+VԎZ}±ʗHyTlza3H bY,8޹Ë́o?tfvWݛemf7{~~IcAyz7ipfGl~D sH'.w:_1|y# cE}ֻB 鉏>2^rܑ|"F'; Օv=֠޽#܀ZMJ=T+AA pck{5}qmH{7( \Y*m Y֧ V Et2hJyWb%J}SGTڄWJdvbNύCydW§ jVk+K3R]nt\X'|~5i%+vNq"LrU4=\z@-ᾙK0+caS\Ͳ_ds`sGHz`f!:?Bjc_ȣi-mTYw |{ } wn*K4>M%;F> :Уe{zFJ9ԯNF n*{C\a.0G,5pNNXӢVI ^+ p9yzhRY aTbI4Q;A kQJ-xH+(SY$ _m-:(]Ů}L؈0+I=Ԙ[:Tk J5()F 8@VM@к Bѹ 2&+q Dv1:;~^S7Nnְz)%uj0H%R ㅽ%b{{l:Fy]NB|p4ˋ/vz~5)ΗS%u|9CÈI4=5cMS;o4*mpԼA,Ev)-s\liFQurimArSX.qE4 )GJD.(mԅk+jZ5j"\rkKs*ԜeLCǚASC[Uȋ>.RPVeW?*ކPT'@(( Uzor9c0P-ކ\-WtS;uߘXI,!SVD]wj?onp@F4P! CPFApg$P!C3'a.HtJS*@C@RjtMHYndQI2Dʿ89 CwͶf9='>Ql!>"5yowjJ*JBϏ&/>=֔!mVk+ig9/av5_d_O{0~v9ߥ6dĨ5V?2?xp,oVpNʔgt|~huD^/֙mxJ >m~ I~Z(A#*X|rv=FrJ^9r#iz2!g yEZC:SFeZڠ}R,9L}|9rik2s]v_xUQE44Kj0=>9%9Pg&eGͭeGwuTxn M8u!8!341Lh5k.Pbɲmqa*ッ5x '֜D!upʳ&PMX(Ǥ6;a]xc{:$:Հ[ǥ3!qKTm>g i4BAOzGuiq>+&Z&-v{ĨP%³{ N97qn2(vR3xEU@5Lǀ@5Dpc8"aR -s[y4&&RJwm~ yx l$b'9rF7/3zl%o?Ŗ,S"ղ,2j*Ⱥ܄/aC5۬;)H:LC(Os` [b rC5%W`RVX1y&:GH*ٳ˙&b|p3 .هns nMĮ :I`}:ƿݜsfgdEth.-j>7Qx38QW g2&&.\"KnQTs7TgThڤNM4S$c.0~ `ɀ[nAVz$|o&+&0U!\>9g:l-ije㣙^jA Ps)fVLf&+HN`D^AcclU9|}m uU5b$oC'e,&1+spuP-)صfd AMBUݜz)PVVyI7Dښ (x׎xh]J+KJUH@ p"Zb8C%Sn*A,b[[g22B{ 2,qDfU<.pvoo;X2 2Gr,hu(clkbJ)v˞[K؞%9W' kn™ !T{zX^IƄD5kԢM._kN 껑)NL&09;0hve")i ݽwJkWѽE +3JŘ*Jg5\E#t#-C_c1ZS9ui'˜rQK[M1!{4G Wp\` MF]8* EJ*w)V+430r)>͡ZqZG-Q#Qc 8"@R^ˈ˵u\~JP@-@Ymo-Rs]w.Cb(l5hZ1]ϺԊ-*^+ (߶{Xn(5$)ՁLѨA')>>jN?u!iel._#mP~Vb,o^7 S??a}& M +VNk7;L2.Ce Qq FF=ؑEqMa!/S]3`5B%g` `$=vCϧfG|m,5Q;ɧn~/hp f4p0oƜ̿0+=+^*EȏA [Q}}I\Paɍ&qZ< ]RAV;gwf`[7u3ۇk(XľYS$1!w5?a6͹V3lF~,ӟg9oh_h\3Ņ+:$䕋hLEq c5Ab":ctnG4Yo- 
QW.92%'r|̷}m/(N'w('mOxJ{(;hhҭaя+:2xyNH3zD#A?$78 X7#?h1U@ A(-_CRedYEr">bFQo XHnceΪpL:G[@EfBTYCHxE1L*o҈+9D{fɘnZKm7LĤs2[=JxϤi`7V)x1bȵT+LUtbBqaAbơĠ~19$.2`Å49OvcdI0pj삌dECtQ”!q냉#qDvD%VϻX2# : !ZTvJ*  PWzo%  'borA*6𷕝/W?'N%1eSaC{$lιzZ!̎pTU UUXv*su0X7$*Z,2ʔ.^.͎BuS:buj>tM*C9#V(E1-+j1ju-:J;-rd #BN#QLP!B !$¸TGF`5SHh"HV,4kOs pKfם̡k6t Mɼ\lҳtõB~Ks$t<M|WͿ\w}U[.$e[1yJItiR.C@ۍr~x/KRm ,g|zbP:mo>~4PX}|g@G;x1Et# gu?/wMߋ3O&os],K~v77| pmhBcş=zG49jy(mKpR-$J ҭ)!*W4V2]X(ǪG;¦i- DBD.zn~"*l[\7);\+0(+'$^!2* ]hp30*eT1]j!_$vJ>(do:$-c C4cK\mGE&YDFK 0|JTؠL)֓ J=-"F׫ݬ=TM>O&0Yҟ<$JcaW2aՁ6@qR϶-Wo8MeqpsBhdt=UmSEU+}P XcHZ[}LH7$X0v+9PpfG\}†&6랯U&iLK)UA2!"#jx ~n/-eo !ZE9rq(3[?S 9Zsj֪ e`\NR y!׵18}(Z$ٿ|:oR_jg¢ZݽK ;GXUHxC\#G]V0/5B/3ԅ|Ac.R91 _CV- b֧Z]p0ЀeVeO?#I:~,GpjscѬ Íc ̣J]ݥ;&-wgoa l'Z-YѺ?|i?|5-h|ZwǟƳ󒀽Ȁ!o]Hӌ=5ueC9*0/#j h}}5@f7_|XQcwZ ȃ Jb.5|/MbHw0C0,GtRk.o> {٧GRr!X=\>vsPwat[y(?{WǑ OI CS !FBʌ)dˁ{w)rȝ]lΊ{8ش8y^zB,so/ܧu]`L//0Ex4,~_ pf`:ϝ> 0~|پBEfț엲~}`F ΘlՊ+*,v/kM&ο_ߝ&qEU.aX[ uaz/ Ҍ|Ƙ 3$OM7͐&du$2eKŵ6ֳUF6[8 LǚK+*فwg֐9k4w-Xt!g \!!g'+--gIHlZv14mf0ux!Z0Pj͊ӣExDqU*Ԍ:J&ib<Nv۔k.8㙍 lZP[ttZU$G$!Fϥ@ak1(9+]!dU 󷰐Wn%~Igٔ1*|c0lf|"X $}C1s󸻼]d-LSړ "GEKC r!On"ـ,y/a:znuHΙ!HGëm"lyƱG#`R,\-/V {N1~P:B023qN%])2OqDּnx"2CPcEp!9'SBGQ|gy|{*Zhe81cPb*-\w,&.wwҚ_Gw@ =Jx9OtZnpg|o?:~7.$ AwEp>MI-5LPg@{C(&d©(ןnÚou5ǮI1suLB 9T_UX1D/[qj/%hHw[} ƁYEhmOeS5Kǩ@ق;8q_O-6Nn-hNG8=sEZ8E= 3cyDnv+jĈ"Nd KGh)2]7oJseny TɎ?/JQʸ$xKT-VP NwIaU?H3^Kr@:"6JX3+64h0`'F5cQ^j5 R9E 3Z>Ghюi؝+{VG5'e}yy{ 6 sѣ'>D6c?'2 n=ֳyb@)c^Ì*&!zJMK#,p V㯫T9As뻋tWya߯8zv(oE9es32edXrI P\[`.J-Z <* [EM&i5 1D_B"['"֜ ) C ]ڊjΎъ=^4`h XЂHEXQX2")9EM%BaJ9X(=$98M)hV0$`ԭ`#D)XQ/r#ۚʀ" ,FD`m- A $CP<@ O AGncbxa`5Wα cJ"U:nTd&tg)9húi$KvORYBUxqUPi8:϶q{=`Bh%Escx nLDߘqB@)]ı(% Fia;Ϧp17wf}|%^_!t"GC8|>cg-@P̫ݚt> <|9/m=ߔۡK &V84_qGs;yq[ o86{_o}躥׀ l|`I_D/U<P=Ǿh.[ FOjTAe@YJVM"D p9#x͍!5"FY^SɅ4:UTw-b4Abq%F?H'|Thо2ݐ :w}<;{qi2ɓ>QZ 6|x0?;γ )ޏ"ۋWVcgBҝqZRWvr*g?{Tx(U|~RCgj.z :{췚 '1[\/?wĥӣVo?ig"aޭhQAZ"%4h^)-LwD~#+ُy~1\`'VYZ^8v[Qλ򵍫FJd)wW+4OJn69Uvt+Kmpmv F`=yT/PuhTR_^_-Keӹx^LI ^\M\y9ij߳?> 돗 
s:}|isT[A3֒><\ͦZi&A]K|BW;Ad>.3]JOnNznM)U>jd!Dkl;|;MwZj1(1g nFֱ[=Mwka!Dl٧Ͻ$Jޭө}v,dޭ~wka!DlӛMvj1(1g n#Avnzޭrߦ/oqy>g[^__u -2,#v "ig.jĢ䶽|@#ӡ9"{f 89{,(gA/˼s"ag@/S"IBSCCt??^@"-lyy a3eKa'E%t;Pt,?eo=SXjJHP VD0Ӹ 9 RsՓ[x IB|J5)Z+~_q .r Ȇ ʊ:UzC6N \T\,:]uߩeY ԄGqo/?|w.//i"ƻgWEoŇ$c")P._: 5bpu(!Wj!a~JX-w&[%7eo4H34"x׷3-ٞm9bkl?Z^3PMVv{]SF||^tzf PDYr:`N.zrrrBdCІ!2S]aCJyi)u7ΰ*T\+za䤣X)NfC&TTܣa0Lu'$'+oTDp q,`I`v!@42G/5@)(6.q“ 8MN3"V!HιL^ OҥiRJ[#ahURjGG+BdZNSEL&PʈF!sp5000di%j_%(C(}2TK*\^n3FiʁqG!Fn9j4jHI*WқW)(2eZiюMv"(t|jS3U!Th54çB[(Bka!D+lf_nx7z@VA>cw;ܶC֩&y ̅ޭrߦ|R&$HU_ZhB ]p-ܰ+Ur#M5,.xmEhp'& ,o?i`3f'${|z {E NO<o)b/FLsSfFB k۷3l,ʟk~xTy;΍f,~3:Ň82.u ❪ \-\8e0E #'}U)Dڼ!1V.lXn܁S.b:#Տƅ@kZG"KHBcTIPmK,}zH'|Fɯʺ-w_ܡjk|7N'8JSNJ %r%v;bݺ6;kd~x띀`gB'wq}&1o|| nʴ}b4 3;gH5av\Ȗn4"x"q+R"YIfUTf(3QP2l j5+͗K\npi00ګ7\0 0ƨB.B"d/Xn+Wd&n?_gevND@6(. G9U0) .6T>ySr9߾ycD!R.%꜐,#7BdQRbXÈQ%15(Wwzw5F"7uaA17M9”}ko.;K^U=K1G[W'0J å< .HOa9L"Ri3v6)UWUa4 \NR3!AX{ɥ*[6ʋ9ō,XO ȯV^M FB^"/ 4O%ņ@ aMyO3p>F\PR0{aO^7 i΍U7!'>~ȗ<|-J,J@}uQepo>f5QW2V(魋e<r3F ؒNPV1Íy?yѥw^;+Qzp VWb]i%F+4* ;KdVaQ2)cZd!]HKx\3*QS ArgZ~tI.IBA' y2yv;7~JU1Dg )]I+#bLay|,jJY:Q06j&̌y]j1kZg,J!6iw팲]HlH VLc9DN+2`mB1=)0F-#tLpB,p+#niK-m V0uA4RT`,mn`vN\豄 Ģ KLgTxfTT r2&8n5 l2A=ܔj z?f\zSSBPm!5uCED'Q 7i>}GRZk)ۉ]kI% EMջzFPT &(l=5\{VB?ysR}էg1O7~&<>ܬ]<.W;Ozi/-\I"&X^[mPôx;_ GЮs6`V:13lŴN .11W/z>3JCTL 0jfuu <.FKT31%M'.]Gd{P2DTmstuJ>omiv :̍oDAP? 
-бӷ&{]qG*kD8c򂷴>zÎ#y>Y!?w};ݻ/b1^k>7 J Bo B :h#( WrS$\qleiǁ9R"k)/( |id*9RY -Yi29ʙ*X{*=Qhק)|nt#Nt0BD̮?e852b(pa8"9FfqA m0TK<Д,D.xNFF\)K^MH8Bc4=E%=6ELfiG?}mr}ؗK9o2O?ea>٭ki]%fLg{>YTNL9'wn[,&铅Xn殺bDޖI hD:E/ew9T`oh<^z*<T1-t:H[$~ #5hMtGoZC)XQ2E(yt1S.31fKA(GHE5M4*@nYYjOJyEVī7o6 %H&r1S\L!wbyyI8'S▻[ʟNJjKErhSCFQ]wjܻ:n48g}!v3PGF9[[ RC3.C!T*,uBفWtNeg: ULc_=9glcbQ.{hCTG9 hWO!$kWruW Ƌ@'s|syl?~K] ?J_,!ɣMmb #/ z!=)JL 5Y3&#s<˴߄tugϟ]~W(r|A؞ ƈ4?6 H7Fl幽lwv15vLs#ƌ?_h^CΉ$\mu::,\\Z]\k]=TväHΞ0L偯]+GL{$ m>IB{bUQ"=|4|Ȯ >"THBl\*lTaէZЖJ[51i]uQɵP=E{I`uQl6ɸ'RVeGZ8"'ai"bZ,+B%EA%+9Ej;pC8Bf-ZYw/-N_y):DO3TK~6;Tq1ԿfK2^sEu-W'X6Aa_,Em\5X & ~Gq(%SF04n^B %j^QT0ɴRPr^<<_BN%BVX*&pg%qOTE@W)JLT^&,Ũ<;?gxU}H 0yx-VIfnߒrE3U޺F+b9"ԣ8mSmw&m0wNQUњ_z+y3oe=+Sr>VV-Y 7$V>^s-僋vFj'ĨSw6鍑nzK@ۆpmlSiD,Y3 Xqi2nit2AEQ\?c,hdĐWu`#B\ ]y#uA-:A*ᯫ/ lm0fHT:c7Mh=:deUO`l4+_ kݕx*ρO-Vp7-GF1+$Rb,\A_'ԯ 1/3~| !"PZq;,B21Ւ)qo=vBh|Z gji0QҐt.qYi - ) QV>lU. KLKN-ɰ C V*rDLFrF`<1L]w\JC+(嘳"3PĹDZdgPH  s R1T0r *Q.,PK V*ks!a `VLrS5Q(/}y*dI1xdEtf1&SB)1BKh-KXgX"=(g'f'nzdsl.5e ި<9؇]~HM9'$F.S>v#-S n[ d{#T^2fw Lj4=>3Ӎ=QZ}@_2ltq/ |aH3_Yyudk3v_F9*ɋ\Ϯ|y93ۓJu۵Փie ~pA8Iv0?>~]bXя; O{kSD>7I9/ͯnog7S#qMjЩ.{L? ,{lB^ؔrwsv: A$>#ʻ 3YR_4ػ`!/Dl ַj&'\mB#L8:>k݁ÉhLUK{_U+W~YTfW {W߁=+2Nq)0$> |t;ӫܿEap˴BqB/^0 4RS:DB ziu<~RNrRcJBPA(,)cϱߪ&b/ 0K=N`KF :$9X)F^kT FtW !t^ y*\U]#LWj]^,x`P!b.k$:$*(@" *>Z) ^[+$8,4V!.8F o4F( +K-cjKVJr+Ame2MLY,f„pa`(w9'l/+$gCaJadLra=lLS!Vm,J "Xb;HCgG*!{_}uO@ŚAWCT šLKU#ecu;'ak.idGpˆ{hᶂ@TMX@6+S;pScUna=zrhߏA)7(Y_4P3?_._~b1}su6oFIn,nN76NO>/&-tuI »T~_LviTʺ)ȞbV4}fC̳r60xߗY܋sͱ.zQ~X!RpD1ӤN?>4!US%{XK0&c_N{C@``>qaz:? 
w_?|3+͘ Pe9SY#Qn,ӎ8o+U z.^]eQ~CMe1p xVJo53M4u ihS &%V4=vLs"4;t`$eqSW>8Oz|@5h'XTVHH;z =꣓ĥ^b\ys}Ix]֤/rX1w3@fsNڄƹCKc(ghs׍͐=TOTk|i1uM~uKM*qKzƭ8ʭ'g@^$Z=}r.&V8>Q1*;Q>|~r=U8$Wx@DD!E`?{&]L#E*܀I[(VKWlDcTkj?AU="@xA̓rn=y}g@p)Yw(RuڡOŃTTQFz6Z材ݴ̫BSm\jګ!"H' Cv kA,: v:{4w tSp<_gb[uSp`/+VSEj 8Q>}NM8%#b%888F\X`\_OTS,%G5cWyȱI1_$Vx}tY`^ XQQjL&y 9h\`jo 5 |!>0S8MU@:4W}Nur:j\w{Tkҋ^M1HiUў/y*ɺ6'vmCB^6)zoi7J)vkA4|G%v>xZ*عvk:dւrm$S[n\B5 Etv;(O4U!!\Da*k7U*Q5 Etv;cxZgn'ڐW.eJb)nNv/bc^;QRC4ۣIbfZdJ-N2 䮉|ƃѦm:}vRZea>33i)PY̜;=&ILٙT.Ԅ/X/g-w>+Ic@ϣ ]Ho\P3Ϣy&]l zw}Z1St4i羋rfx,P 9|Wy~, V[TZ%pG?/{_vyB;^1Q(}e bYRZ* xbm[-Xa#V m`̻HlKi&J O,N*An]No㼝x/,9Ҿ~p+WL\3bMRiO3B>QP;]Mބf~~Sd z3+2c3|`?o~ ޯ?{Dfb+cG !R*e]|NٛҘ 9Gy)19 ;땐TpG1m<$<2`llt:ZcRYm+J/VXVU~FTX,%_ԊAR(ly&&qc76K'Eȥ8!(Fﭘ*hiE@4oLH,r2p$q߈a!85>}}CT{rs25Be)nQ()tA2;BS9b #_qp%9Rtpy3g Dy~͍DG}PEŧ  pmęN'L˲ |qb簱0y@8`>ÖBuRE  X6ZU[Cbm?k'wD7g/W?*c]?=pUb?~7Hs$NDl{Gᱡ(̋ނŏbuxzHY98??x6_L;4Jq0z$p01V=1|1JxS}t8_Caywcʼnip Z{X >!u ;MLd@pMQ1Zӈǂb0Fq - [comeOje{S;;5_EUp12~ @XD,r1].x: 9BQRo&XčdEݒ;o`yiy4Ş݇,w_\n;C,IT%SI^xTGLs*;>do8bH^!8qx(M:Ry+*}Rd2%c+͸J`b+zâ0bVa M&UV`>$G2e;=-*%+ܚǃcQ "똰ZolXxDcR8Yэ@ *]v;7DY j!&;T#}vdN !g nAm[omMPr*]wkģ?n׍>x0O۠\J JZo>zRZ%poaklĪ"/up=Z]rmQ.~[iȎj<&Rx;Sn?Uy $ ZٞT`&hzE" &̆)#lQeLJŬ iB#NN`Jziq(Eg&Njߋ~^43Q0!PfdS)\Ɩd^cc ral5^HA8ԓ 7j_[aP)p[\#Ra_Ͱ5ʮLmgt@ro F쨡@|F^`F W Bs.Ԛ3D?[ _jj?g>*joiW}T@ੵv=IH>i7A|gZOHAJ)b4I%V M^lrOoG#pN aغV:R@$#-Ed_in3*Ad]Z; ;;WN\6^ cP'PGM5Nr7vk~" 7ٜ _.ek=Q dxg wylɽ5l?#}@Pm譙jm}W1u$_?MM;z ٻmtW4rKKf$i'M&JrN(٢dI/ m&v,x<@D]Q}Z]7[`y!& O[mf{~ƀE.n<:c` h26vY_=diG1yݟse7cCItX^!C]D} Шw<`x)!(mVGz}#v Ljr&Jf/| <6 <9u;-&CazvTOwKب0xP0(.2\pdGfXQ|A/QvsuO!CO1Te$qçOVqqRӌDX䣴|s Zq#)9p`kQQ t\f|[G9+>9g{&J`~kc4*#b$`N*؆pF/mR"\a{vIύJV=Xo#i+/_GmWpg7wn*΃~hİ\~߹K_pjZYs $i_$,!پDrfX!3~{LVw8b[Cq<=qnSٺ8rz9ݷ3? 
Mar 10 14:01:34 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 10 14:01:34 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 10 14:01:34 crc restorecon[4698]:
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c476,c820 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc 
restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 
14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 14:01:34 crc 
restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 14:01:34 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:34 crc restorecon[4698]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:34 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 
crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc 
restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to
system_u:object_r:container_file_t:s0
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:35 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 14:01:35 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 10 14:01:35 crc kubenswrapper[4911]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 14:01:35 crc kubenswrapper[4911]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 10 14:01:35 crc kubenswrapper[4911]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 14:01:35 crc kubenswrapper[4911]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 14:01:35 crc kubenswrapper[4911]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 10 14:01:35 crc kubenswrapper[4911]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.905496 4911 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914497 4911 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914532 4911 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914544 4911 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914554 4911 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914563 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914572 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914580 4911 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914591 4911 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914602 4911 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914612 4911 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914621 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914630 4911 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914663 4911 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914672 4911 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914681 4911 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914691 4911 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914699 4911 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914709 4911 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914717 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914758 4911 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914771 4911 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914785 4911 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914797 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914808 4911 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914819 4911 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914831 4911 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914845 4911 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914855 4911 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914865 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914873 4911 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914881 4911 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914890 4911 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914898 4911 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914907 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914915 4911 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914924 4911 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914931 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914941 4911 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914949 4911 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914957 4911 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914965 4911 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914973 4911 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914981 4911 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914989 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.914997 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915005 4911 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915012 4911 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915021 4911 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915028 4911 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915035 4911 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915043 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915051 4911 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915059 4911 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915067 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915075 4911 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915087 4911 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915098 4911 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915106 4911 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915117 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915125 4911 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915133 4911 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915141 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915149 4911 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915157 4911 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915164 4911 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915172 4911 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915179 4911 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915187 4911 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915195 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915203 4911 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.915210 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915384 4911 flags.go:64] FLAG: --address="0.0.0.0"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915401 4911 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915419 4911 flags.go:64] FLAG: --anonymous-auth="true"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915431 4911 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915443 4911 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915453 4911 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915470 4911 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915482 4911 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915491 4911 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915501 4911 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915511 4911 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915520 4911 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915530 4911 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915539 4911 flags.go:64] FLAG: --cgroup-root=""
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915548 4911 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915557 4911 flags.go:64] FLAG: --client-ca-file=""
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915566 4911 flags.go:64] FLAG: --cloud-config=""
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915574 4911 flags.go:64] FLAG: --cloud-provider=""
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915583 4911 flags.go:64] FLAG: --cluster-dns="[]"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915596 4911 flags.go:64] FLAG: --cluster-domain=""
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915605 4911 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915615 4911 flags.go:64] FLAG: --config-dir=""
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915623 4911 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915634 4911 flags.go:64] FLAG: --container-log-max-files="5"
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915645 4911 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 10 14:01:35 crc kubenswrapper[4911]:
I0310 14:01:35.915655 4911 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915664 4911 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915673 4911 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915682 4911 flags.go:64] FLAG: --contention-profiling="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915692 4911 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915700 4911 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915710 4911 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915718 4911 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915764 4911 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915773 4911 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915782 4911 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915791 4911 flags.go:64] FLAG: --enable-load-reader="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915800 4911 flags.go:64] FLAG: --enable-server="true" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915809 4911 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915822 4911 flags.go:64] FLAG: --event-burst="100" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915832 4911 flags.go:64] FLAG: --event-qps="50" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915840 4911 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 10 
14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915849 4911 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915860 4911 flags.go:64] FLAG: --eviction-hard="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915872 4911 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915881 4911 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915889 4911 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915899 4911 flags.go:64] FLAG: --eviction-soft="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915907 4911 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915916 4911 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915925 4911 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915936 4911 flags.go:64] FLAG: --experimental-mounter-path="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915945 4911 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915954 4911 flags.go:64] FLAG: --fail-swap-on="true" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915964 4911 flags.go:64] FLAG: --feature-gates="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.915990 4911 flags.go:64] FLAG: --file-check-frequency="20s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916000 4911 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916009 4911 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916018 4911 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916027 4911 flags.go:64] FLAG: --healthz-port="10248" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916037 4911 flags.go:64] FLAG: --help="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916046 4911 flags.go:64] FLAG: --hostname-override="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916055 4911 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916064 4911 flags.go:64] FLAG: --http-check-frequency="20s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916073 4911 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916082 4911 flags.go:64] FLAG: --image-credential-provider-config="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916091 4911 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916100 4911 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916109 4911 flags.go:64] FLAG: --image-service-endpoint="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916118 4911 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916127 4911 flags.go:64] FLAG: --kube-api-burst="100" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916136 4911 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916146 4911 flags.go:64] FLAG: --kube-api-qps="50" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916155 4911 flags.go:64] FLAG: --kube-reserved="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916164 4911 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916173 4911 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916182 4911 flags.go:64] FLAG: --kubelet-cgroups="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916191 4911 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916200 4911 flags.go:64] FLAG: --lock-file="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916210 4911 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916219 4911 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916228 4911 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916242 4911 flags.go:64] FLAG: --log-json-split-stream="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916251 4911 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916260 4911 flags.go:64] FLAG: --log-text-split-stream="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916269 4911 flags.go:64] FLAG: --logging-format="text" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916278 4911 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916288 4911 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916296 4911 flags.go:64] FLAG: --manifest-url="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916306 4911 flags.go:64] FLAG: --manifest-url-header="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916317 4911 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916327 4911 flags.go:64] FLAG: --max-open-files="1000000" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916338 4911 
flags.go:64] FLAG: --max-pods="110" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916347 4911 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916356 4911 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916365 4911 flags.go:64] FLAG: --memory-manager-policy="None" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916375 4911 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916384 4911 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916393 4911 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916403 4911 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916424 4911 flags.go:64] FLAG: --node-status-max-images="50" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916433 4911 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916443 4911 flags.go:64] FLAG: --oom-score-adj="-999" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916452 4911 flags.go:64] FLAG: --pod-cidr="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916460 4911 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916473 4911 flags.go:64] FLAG: --pod-manifest-path="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916482 4911 flags.go:64] FLAG: --pod-max-pids="-1" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916491 4911 flags.go:64] FLAG: --pods-per-core="0" Mar 10 14:01:35 
crc kubenswrapper[4911]: I0310 14:01:35.916500 4911 flags.go:64] FLAG: --port="10250" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916510 4911 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916518 4911 flags.go:64] FLAG: --provider-id="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916528 4911 flags.go:64] FLAG: --qos-reserved="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916537 4911 flags.go:64] FLAG: --read-only-port="10255" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916546 4911 flags.go:64] FLAG: --register-node="true" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916555 4911 flags.go:64] FLAG: --register-schedulable="true" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916565 4911 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916582 4911 flags.go:64] FLAG: --registry-burst="10" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916591 4911 flags.go:64] FLAG: --registry-qps="5" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916600 4911 flags.go:64] FLAG: --reserved-cpus="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916608 4911 flags.go:64] FLAG: --reserved-memory="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916620 4911 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916629 4911 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916639 4911 flags.go:64] FLAG: --rotate-certificates="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916648 4911 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916657 4911 flags.go:64] FLAG: --runonce="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916665 4911 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916675 4911 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916685 4911 flags.go:64] FLAG: --seccomp-default="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916694 4911 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916702 4911 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916712 4911 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916721 4911 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916756 4911 flags.go:64] FLAG: --storage-driver-password="root" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916765 4911 flags.go:64] FLAG: --storage-driver-secure="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916775 4911 flags.go:64] FLAG: --storage-driver-table="stats" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916783 4911 flags.go:64] FLAG: --storage-driver-user="root" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916792 4911 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916802 4911 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916811 4911 flags.go:64] FLAG: --system-cgroups="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916820 4911 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916834 4911 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916843 4911 flags.go:64] FLAG: --tls-cert-file="" Mar 10 
14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916851 4911 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916862 4911 flags.go:64] FLAG: --tls-min-version="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916871 4911 flags.go:64] FLAG: --tls-private-key-file="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916880 4911 flags.go:64] FLAG: --topology-manager-policy="none" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916894 4911 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916903 4911 flags.go:64] FLAG: --topology-manager-scope="container" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916919 4911 flags.go:64] FLAG: --v="2" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916931 4911 flags.go:64] FLAG: --version="false" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916943 4911 flags.go:64] FLAG: --vmodule="" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916956 4911 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.916966 4911 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917178 4911 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917189 4911 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917198 4911 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917207 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917218 4911 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917228 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917237 4911 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917246 4911 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917255 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917264 4911 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917272 4911 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917280 4911 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917288 4911 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917299 4911 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917308 4911 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917317 4911 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917326 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917335 4911 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917343 4911 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917352 4911 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917360 4911 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917368 4911 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917376 4911 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917384 4911 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917392 4911 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917403 4911 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917414 4911 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917426 4911 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917435 4911 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917443 4911 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917451 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917460 4911 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917467 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917475 4911 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917484 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917492 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917500 4911 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917508 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917516 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917524 4911 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917532 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917540 4911 
feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917548 4911 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917556 4911 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917564 4911 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917572 4911 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917580 4911 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917588 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917595 4911 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917603 4911 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917611 4911 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917618 4911 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917627 4911 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917635 4911 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917642 4911 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917650 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 14:01:35 crc 
kubenswrapper[4911]: W0310 14:01:35.917658 4911 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917668 4911 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917676 4911 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917689 4911 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917699 4911 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917708 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917717 4911 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917750 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917759 4911 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917767 4911 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917776 4911 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917784 4911 feature_gate.go:330] unrecognized feature gate: Example Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917793 4911 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.917801 4911 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 
14:01:35.917810 4911 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.917892 4911 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.929748 4911 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.929809 4911 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.929965 4911 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.929988 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930007 4911 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930017 4911 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930027 4911 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930035 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930043 4911 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930052 4911 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities 
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930060 4911 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930069 4911 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930078 4911 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930085 4911 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930094 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930102 4911 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930110 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930119 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930128 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930139 4911 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930152 4911 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930162 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930171 4911 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930179 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930187 4911 feature_gate.go:330] unrecognized feature gate: Example Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930195 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930202 4911 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930210 4911 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930218 4911 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930226 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930234 4911 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930242 4911 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930249 4911 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930257 4911 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 
14:01:35.930265 4911 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930272 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930282 4911 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930289 4911 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930332 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930340 4911 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930348 4911 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930356 4911 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930364 4911 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930375 4911 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930385 4911 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930394 4911 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930402 4911 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930410 4911 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930419 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930427 4911 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930436 4911 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930445 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930453 4911 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930461 4911 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930470 4911 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930478 4911 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930486 4911 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930494 4911 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930502 4911 feature_gate.go:330] 
unrecognized feature gate: SetEIPForNLBIngressController Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930510 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930517 4911 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930525 4911 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930535 4911 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930544 4911 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930553 4911 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930562 4911 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930570 4911 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930578 4911 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930585 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930593 4911 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930601 4911 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930611 4911 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930621 4911 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.930635 4911 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930902 4911 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930919 4911 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930931 4911 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930942 4911 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930951 4911 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930961 4911 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930969 4911 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930977 4911 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930988 4911 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.930997 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931006 4911 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931014 4911 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931022 4911 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931030 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931037 4911 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931045 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931053 4911 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931060 4911 
feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931068 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931075 4911 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931083 4911 feature_gate.go:330] unrecognized feature gate: Example Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931091 4911 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931098 4911 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931106 4911 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931113 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931121 4911 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931129 4911 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931136 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931144 4911 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931156 4911 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931167 4911 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931177 4911 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931186 4911 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931196 4911 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931209 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931217 4911 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931226 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931236 4911 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931246 4911 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931256 4911 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931266 4911 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931277 4911 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931287 4911 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931296 4911 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931304 4911 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931314 4911 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931324 4911 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931335 4911 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931343 4911 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931352 4911 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931360 4911 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931368 4911 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931377 4911 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931384 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931392 4911 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931399 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931407 4911 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931415 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931422 4911 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931430 4911 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931437 4911 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931445 4911 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931452 4911 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931460 4911 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931467 4911 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931476 4911 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931485 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931494 4911 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931502 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931510 4911 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 14:01:35 crc kubenswrapper[4911]: W0310 14:01:35.931518 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.931532 4911 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false 
NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.933111 4911 server.go:940] "Client rotation is on, will bootstrap in background" Mar 10 14:01:35 crc kubenswrapper[4911]: E0310 14:01:35.938921 4911 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.943588 4911 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.943820 4911 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.945795 4911 server.go:997] "Starting client certificate rotation" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.945837 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.946018 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.974887 4911 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 14:01:35 crc kubenswrapper[4911]: I0310 14:01:35.978046 4911 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 14:01:35 crc kubenswrapper[4911]: E0310 14:01:35.978464 4911 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.007920 4911 log.go:25] "Validated CRI v1 runtime API" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.051850 4911 log.go:25] "Validated CRI v1 image API" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.056317 4911 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.061135 4911 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-13-57-06-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.061612 4911 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.090675 4911 manager.go:217] Machine: {Timestamp:2026-03-10 14:01:36.087111038 +0000 UTC m=+0.650630985 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:74bee069-21da-4cc8-a69e-a4f54ba3e964 BootID:36e9ec0c-5432-482b-b6c8-9d6220be2548 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:99:04:b7 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:99:04:b7 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:95:b4:f9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:84:8e:6f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:21:87:86 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:58:9c:1a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ca:22:a2:8f:d0:0d Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:e2:f4:41:fe:21:41 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.091244 4911 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.091448 4911 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.093210 4911 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.093532 4911 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.093600 4911 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.093996 4911 topology_manager.go:138] "Creating topology manager with none policy" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.094016 4911 container_manager_linux.go:303] "Creating device plugin manager" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.095114 4911 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.095170 4911 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.095497 4911 state_mem.go:36] "Initialized new in-memory state store" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.095670 4911 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.100268 4911 kubelet.go:418] "Attempting to sync node with API server" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.100328 4911 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.100387 4911 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.100415 4911 kubelet.go:324] "Adding apiserver pod source" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.100440 4911 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 
14:01:36.105848 4911 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 10 14:01:36 crc kubenswrapper[4911]: W0310 14:01:36.106908 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:36 crc kubenswrapper[4911]: W0310 14:01:36.107000 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.107122 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError" Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.107053 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.107326 4911 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.109099 4911 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.110944 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111001 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111018 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111034 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111058 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111073 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111091 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111117 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111133 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111147 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111186 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.111204 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.112309 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.113078 4911 server.go:1280] "Started kubelet"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.114543 4911 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.114561 4911 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 10 14:01:36 crc systemd[1]: Started Kubernetes Kubelet.
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.118552 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.118852 4911 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.121027 4911 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.121077 4911 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.121185 4911 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.121214 4911 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.121380 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.121921 4911 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.125128 4911 factory.go:55] Registering systemd factory
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.125172 4911 factory.go:221] Registration of the systemd container factory successfully
Mar 10 14:01:36 crc kubenswrapper[4911]: W0310 14:01:36.125380 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused
Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.128198 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="200ms"
Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.128306 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.136116 4911 server.go:460] "Adding debug handlers to kubelet server"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.136168 4911 factory.go:153] Registering CRI-O factory
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.136201 4911 factory.go:221] Registration of the crio container factory successfully
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.136317 4911 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.136351 4911 factory.go:103] Registering Raw factory
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.136380 4911 manager.go:1196] Started watching for new ooms in manager
Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.135626 4911 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.153:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b7fb591d740ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.113033422 +0000 UTC m=+0.676553379,LastTimestamp:2026-03-10 14:01:36.113033422 +0000 UTC m=+0.676553379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.137871 4911 manager.go:319] Starting recovery of all containers
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151440 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151554 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151586 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151615 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151643 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151670 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151702 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151767 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151801 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151828 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151854 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151881 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151903 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151936 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151965 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.151993 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152020 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152047 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152074 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152102 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152232 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152255 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152274 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152297 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152317 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152336 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152360 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152382 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152407 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152429 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152447 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152466 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152484 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152504 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152523 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152546 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152566 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152584 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152604 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152623 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152642 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152661 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152679 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152698 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152717 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152769 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152789 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152812 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152831 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152851 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152869 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152890 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152917 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152938 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152964 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.152992 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153019 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153044 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153071 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153098 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153123 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153145 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153165 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153187 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153206 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153224 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153243 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153265 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153284 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153303 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153321 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153340 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153361 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153382 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153400 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153418 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153436 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153456 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153475 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153496 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153516 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153533 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153551 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153574 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153594 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153613 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153631 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153648 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153666 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153683 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153703 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153804 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153825 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153864 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153884 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153917 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153938 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153957 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153974 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.153995 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154015 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154034 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154055 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154074 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154105 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154126 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154147 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154170 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154237 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154258 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154289 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154321 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154342 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154362 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154381 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154399 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154417 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154437 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154455 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154475 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154491 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154518 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154538 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154556 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154575 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154597 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154616 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154636 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154654 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154672 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154691 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154710 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154755 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154776 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154794 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154817 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" 
seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154837 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154855 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154874 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154893 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154913 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154933 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 
14:01:36.154956 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154976 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.154997 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155016 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155035 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155055 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155074 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155093 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155112 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155132 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155153 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155172 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155190 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155210 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155229 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155247 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155268 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155289 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155309 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" 
seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.155328 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.163585 4911 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.163695 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.163762 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164031 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164084 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164393 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164432 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164459 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164494 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164534 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164566 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" 
seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164590 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164616 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164653 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164679 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164701 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164763 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164790 4911 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164831 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164855 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164879 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164914 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164942 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164972 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.164995 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165021 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165055 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165078 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165107 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165131 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165157 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165196 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165228 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165267 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165303 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165325 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165354 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165376 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165405 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165427 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165452 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165482 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165504 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165529 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165557 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165577 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165599 4911 reconstruct.go:97] "Volume reconstruction finished" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.165615 4911 reconciler.go:26] "Reconciler: start to sync state" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.177621 4911 manager.go:324] Recovery completed Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.186387 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.186954 4911 kubelet_network_linux.go:50] "Initialized 
iptables rules." protocol="IPv4" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.187878 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.187922 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.187935 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.189004 4911 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.189026 4911 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.189045 4911 state_mem.go:36] "Initialized new in-memory state store" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.189624 4911 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.189704 4911 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 10 14:01:36 crc kubenswrapper[4911]: W0310 14:01:36.191409 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.191717 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.191881 4911 kubelet.go:2335] "Starting kubelet main sync loop" Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.192143 4911 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.215321 4911 policy_none.go:49] "None policy: Start" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.216838 4911 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.216992 4911 state_mem.go:35] "Initializing new in-memory state store" Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.221697 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.293129 4911 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 
14:01:36.297166 4911 manager.go:334] "Starting Device Plugin manager" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.297397 4911 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.297420 4911 server.go:79] "Starting device plugin registration server" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.297880 4911 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.297943 4911 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.298153 4911 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.298334 4911 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.298353 4911 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.310848 4911 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.328955 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="400ms" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.398479 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.400426 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.400512 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.400534 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.400679 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.401715 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.153:6443: connect: connection refused" node="crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.493632 4911 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.493911 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.495956 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.496041 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.496061 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.496312 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc 
kubenswrapper[4911]: I0310 14:01:36.496809 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.496909 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.497961 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.498081 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.498104 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.498340 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.498370 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.498444 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.498474 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.498753 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.498926 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.500266 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.500307 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.500327 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.500325 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.500485 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.500526 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.500814 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.500917 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.500987 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.502459 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.502514 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.502534 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.502767 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.502962 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.503055 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.503823 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.503886 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.504261 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.504517 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.504556 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.504576 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.504577 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.504624 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.504639 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.504875 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.504928 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.506291 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.506342 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.506361 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.570864 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.571064 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.571227 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 14:01:36 
crc kubenswrapper[4911]: I0310 14:01:36.571269 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.571450 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.571530 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.571694 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.571830 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.572095 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.572288 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.572394 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.572556 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.572595 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.572831 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.572936 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.602088 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.603975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.604039 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.604060 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.604104 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.604855 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.153:6443: connect: connection refused" node="crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.677889 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc 
kubenswrapper[4911]: I0310 14:01:36.677975 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678005 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678033 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678061 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678084 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678109 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678142 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678183 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678211 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678218 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678268 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 
14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678318 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678317 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678380 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678418 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678238 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678283 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678438 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678508 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678571 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678582 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678620 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678653 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678502 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678688 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678766 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678774 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678792 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.678941 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: E0310 14:01:36.730937 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="800ms" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.842030 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.853924 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.891837 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.903103 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: I0310 14:01:36.908905 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:36 crc kubenswrapper[4911]: W0310 14:01:36.926078 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-85c24e635580c106bd55bcdba81f928bea97703b647c3af0272bd22ab29503de WatchSource:0}: Error finding container 85c24e635580c106bd55bcdba81f928bea97703b647c3af0272bd22ab29503de: Status 404 returned error can't find the container with id 85c24e635580c106bd55bcdba81f928bea97703b647c3af0272bd22ab29503de Mar 10 14:01:36 crc kubenswrapper[4911]: W0310 14:01:36.931065 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-39993e9a03c914137182abecfe49064785386939baa9be7af78faff7c7ccefd2 WatchSource:0}: Error finding container 39993e9a03c914137182abecfe49064785386939baa9be7af78faff7c7ccefd2: Status 404 returned error can't find the container with id 39993e9a03c914137182abecfe49064785386939baa9be7af78faff7c7ccefd2 Mar 10 14:01:36 crc kubenswrapper[4911]: W0310 14:01:36.943404 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2f594f82969838d520e649023792e01fcfccc2456ea261dc1ac5699010c3fddd WatchSource:0}: Error finding container 2f594f82969838d520e649023792e01fcfccc2456ea261dc1ac5699010c3fddd: Status 404 returned error can't find the container with id 2f594f82969838d520e649023792e01fcfccc2456ea261dc1ac5699010c3fddd Mar 10 14:01:36 crc kubenswrapper[4911]: W0310 14:01:36.955623 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a197279c9c5c08fd5e993134a1c02eed5c188a8f7432888d84468e73189e723b 
WatchSource:0}: Error finding container a197279c9c5c08fd5e993134a1c02eed5c188a8f7432888d84468e73189e723b: Status 404 returned error can't find the container with id a197279c9c5c08fd5e993134a1c02eed5c188a8f7432888d84468e73189e723b Mar 10 14:01:36 crc kubenswrapper[4911]: W0310 14:01:36.959619 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-33077507351c99e1735109ba1fc963c648929e202842ed7ec722f52272fc1c3a WatchSource:0}: Error finding container 33077507351c99e1735109ba1fc963c648929e202842ed7ec722f52272fc1c3a: Status 404 returned error can't find the container with id 33077507351c99e1735109ba1fc963c648929e202842ed7ec722f52272fc1c3a Mar 10 14:01:37 crc kubenswrapper[4911]: W0310 14:01:37.001587 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:37 crc kubenswrapper[4911]: E0310 14:01:37.001825 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.005403 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.008411 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.008474 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.008500 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.008551 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 14:01:37 crc kubenswrapper[4911]: E0310 14:01:37.009146 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.153:6443: connect: connection refused" node="crc" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.120440 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.198094 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2f594f82969838d520e649023792e01fcfccc2456ea261dc1ac5699010c3fddd"} Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.199942 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"39993e9a03c914137182abecfe49064785386939baa9be7af78faff7c7ccefd2"} Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.201711 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"85c24e635580c106bd55bcdba81f928bea97703b647c3af0272bd22ab29503de"} Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.203376 4911 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"33077507351c99e1735109ba1fc963c648929e202842ed7ec722f52272fc1c3a"} Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.204922 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a197279c9c5c08fd5e993134a1c02eed5c188a8f7432888d84468e73189e723b"} Mar 10 14:01:37 crc kubenswrapper[4911]: W0310 14:01:37.289171 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:37 crc kubenswrapper[4911]: E0310 14:01:37.289306 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError" Mar 10 14:01:37 crc kubenswrapper[4911]: E0310 14:01:37.532079 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="1.6s" Mar 10 14:01:37 crc kubenswrapper[4911]: W0310 14:01:37.627713 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:37 crc kubenswrapper[4911]: E0310 14:01:37.627903 4911 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError" Mar 10 14:01:37 crc kubenswrapper[4911]: W0310 14:01:37.661878 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:37 crc kubenswrapper[4911]: E0310 14:01:37.661988 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.809956 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.811669 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.811716 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.811757 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.811786 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 14:01:37 crc kubenswrapper[4911]: E0310 14:01:37.812426 4911 kubelet_node_status.go:99] "Unable to register node 
with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.153:6443: connect: connection refused" node="crc" Mar 10 14:01:37 crc kubenswrapper[4911]: I0310 14:01:37.996579 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 14:01:37 crc kubenswrapper[4911]: E0310 14:01:37.999166 4911 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.119635 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.211968 4911 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0" exitCode=0 Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.212040 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0"} Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.212370 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.215561 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 
14:01:38.215609 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.215625 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.215791 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12" exitCode=0 Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.215827 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12"} Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.215887 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.217111 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.217187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.217202 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.217880 4911 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9" exitCode=0 Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.217970 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9"} Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.217977 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.219021 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.219086 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.219109 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.219112 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.220548 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.220582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.220598 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.220615 4911 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f" exitCode=0 Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.220678 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f"} Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.220761 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.221882 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.221913 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.221925 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.224011 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5"} Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.224186 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1"} Mar 10 14:01:38 crc kubenswrapper[4911]: I0310 14:01:38.224219 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.119903 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:39 crc kubenswrapper[4911]: E0310 14:01:39.133798 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="3.2s" Mar 10 14:01:39 crc kubenswrapper[4911]: W0310 14:01:39.213746 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.153:6443: connect: connection refused Mar 10 14:01:39 crc kubenswrapper[4911]: E0310 14:01:39.213857 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.153:6443: connect: connection refused" logger="UnhandledError" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.228996 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.229039 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.230182 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.230225 4911 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.230240 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.231590 4911 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780" exitCode=0 Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.231675 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.231686 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.232266 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.232298 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.232307 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.237459 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.237550 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.237647 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.237740 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.239318 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.239498 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.244553 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.244626 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.244640 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.255232 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.255968 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.255942 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.256168 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842"} Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.257135 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.257182 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.257194 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.412564 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.413781 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.413829 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 
14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.413841 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:39 crc kubenswrapper[4911]: I0310 14:01:39.413873 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 14:01:39 crc kubenswrapper[4911]: E0310 14:01:39.414334 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.153:6443: connect: connection refused" node="crc" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.265637 4911 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841" exitCode=0 Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.265684 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841"} Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.265828 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.267554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.267588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.267599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.273125 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ece7ea99b8b7fee7df200ed60bb87bc6cde184734478f0cf242ffda34dbe3258"} Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.273207 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.273252 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.273270 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.273250 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.273361 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.274254 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.274287 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.274296 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.275011 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.275040 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.275050 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:40 crc 
kubenswrapper[4911]: I0310 14:01:40.275074 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.275113 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.275132 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.275109 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.275364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.275392 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.578884 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.587297 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.760177 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:40 crc kubenswrapper[4911]: I0310 14:01:40.985552 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.265608 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.279381 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.279812 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71"} Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.279852 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7"} Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.279863 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a"} Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.279872 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe"} Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.279880 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5"} Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.279902 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.279919 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.279927 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.280848 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.280859 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.280887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.280899 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.280870 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.280959 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.281759 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.281790 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:41 crc kubenswrapper[4911]: I0310 14:01:41.281802 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.206397 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.233292 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.282796 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.283409 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.283652 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.283838 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.288551 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.288598 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.288608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.288557 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.288654 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.288674 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.288917 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.288939 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.288954 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.615296 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.616912 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.616969 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.616986 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:42 crc kubenswrapper[4911]: I0310 14:01:42.617020 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.286559 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.286673 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.288372 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.288440 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.288456 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.369408 4911 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.369649 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.371309 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.371378 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.371392 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.724032 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.761263 4911 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 14:01:43 crc kubenswrapper[4911]: I0310 14:01:43.761417 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 14:01:44 crc kubenswrapper[4911]: I0310 14:01:44.289798 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:44 crc 
kubenswrapper[4911]: I0310 14:01:44.291532 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:44 crc kubenswrapper[4911]: I0310 14:01:44.291640 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:44 crc kubenswrapper[4911]: I0310 14:01:44.291662 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:45 crc kubenswrapper[4911]: I0310 14:01:45.574675 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 14:01:45 crc kubenswrapper[4911]: I0310 14:01:45.575001 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:45 crc kubenswrapper[4911]: I0310 14:01:45.576504 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:45 crc kubenswrapper[4911]: I0310 14:01:45.576541 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:45 crc kubenswrapper[4911]: I0310 14:01:45.576550 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:46 crc kubenswrapper[4911]: E0310 14:01:46.311451 4911 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 14:01:46 crc kubenswrapper[4911]: I0310 14:01:46.554549 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:46 crc kubenswrapper[4911]: I0310 14:01:46.554813 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:46 crc kubenswrapper[4911]: I0310 14:01:46.556263 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:46 crc kubenswrapper[4911]: I0310 14:01:46.556369 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:46 crc kubenswrapper[4911]: I0310 14:01:46.556391 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:49 crc kubenswrapper[4911]: W0310 14:01:49.827825 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 14:01:49 crc kubenswrapper[4911]: I0310 14:01:49.827942 4911 trace.go:236] Trace[355199140]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 14:01:39.826) (total time: 10001ms): Mar 10 14:01:49 crc kubenswrapper[4911]: Trace[355199140]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:01:49.827) Mar 10 14:01:49 crc kubenswrapper[4911]: Trace[355199140]: [10.001288176s] [10.001288176s] END Mar 10 14:01:49 crc kubenswrapper[4911]: E0310 14:01:49.827968 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 14:01:49 crc kubenswrapper[4911]: E0310 14:01:49.996062 4911 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" 
event="&Event{ObjectMeta:{crc.189b7fb591d740ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.113033422 +0000 UTC m=+0.676553379,LastTimestamp:2026-03-10 14:01:36.113033422 +0000 UTC m=+0.676553379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:01:50 crc kubenswrapper[4911]: W0310 14:01:50.104494 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.104587 4911 trace.go:236] Trace[1280284509]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 14:01:40.103) (total time: 10000ms): Mar 10 14:01:50 crc kubenswrapper[4911]: Trace[1280284509]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (14:01:50.104) Mar 10 14:01:50 crc kubenswrapper[4911]: Trace[1280284509]: [10.000905303s] [10.000905303s] END Mar 10 14:01:50 crc kubenswrapper[4911]: E0310 14:01:50.104606 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.120479 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 10 14:01:50 crc kubenswrapper[4911]: E0310 14:01:50.463277 4911 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 14:01:50 crc kubenswrapper[4911]: E0310 14:01:50.467111 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:50Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 10 14:01:50 crc kubenswrapper[4911]: E0310 14:01:50.469215 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:50Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.470637 4911 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.470684 4911 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 14:01:50 crc kubenswrapper[4911]: W0310 14:01:50.471825 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:50Z is after 2026-02-23T05:33:13Z Mar 10 14:01:50 crc kubenswrapper[4911]: E0310 14:01:50.471935 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 14:01:50 crc kubenswrapper[4911]: W0310 14:01:50.477557 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:50Z is after 2026-02-23T05:33:13Z Mar 10 14:01:50 crc kubenswrapper[4911]: E0310 14:01:50.477682 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T14:01:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.480231 4911 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.480300 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.678564 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.678780 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.679888 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.679945 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.679958 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.732036 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.989966 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.990106 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.991295 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.991402 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:50 crc kubenswrapper[4911]: I0310 14:01:50.991446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.124332 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:51Z is after 2026-02-23T05:33:13Z Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.312800 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.315299 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ece7ea99b8b7fee7df200ed60bb87bc6cde184734478f0cf242ffda34dbe3258" exitCode=255 Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.315504 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.315773 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ece7ea99b8b7fee7df200ed60bb87bc6cde184734478f0cf242ffda34dbe3258"} Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.316046 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.316714 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.316789 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.316810 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.317528 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.317581 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.317599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.318438 4911 scope.go:117] "RemoveContainer" containerID="ece7ea99b8b7fee7df200ed60bb87bc6cde184734478f0cf242ffda34dbe3258" Mar 10 14:01:51 crc kubenswrapper[4911]: I0310 14:01:51.341855 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.124136 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:01:52Z is after 2026-02-23T05:33:13Z Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.240625 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.321614 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.322311 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.325564 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="674dac9f9aef1286090b1f084437944cac4d7ae7d6917bc203d5adac05b8bedc" exitCode=255 Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.325663 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"674dac9f9aef1286090b1f084437944cac4d7ae7d6917bc203d5adac05b8bedc"} Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.325814 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.325842 4911 scope.go:117] "RemoveContainer" containerID="ece7ea99b8b7fee7df200ed60bb87bc6cde184734478f0cf242ffda34dbe3258" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.325846 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.327721 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.327788 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.327811 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.327831 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.327835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.327991 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.329082 4911 scope.go:117] "RemoveContainer" containerID="674dac9f9aef1286090b1f084437944cac4d7ae7d6917bc203d5adac05b8bedc" Mar 10 14:01:52 crc kubenswrapper[4911]: E0310 14:01:52.329449 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:01:52 crc kubenswrapper[4911]: I0310 14:01:52.334249 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:53 crc kubenswrapper[4911]: I0310 14:01:53.124418 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:53Z is after 2026-02-23T05:33:13Z Mar 10 14:01:53 crc kubenswrapper[4911]: I0310 14:01:53.330752 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 14:01:53 crc kubenswrapper[4911]: I0310 14:01:53.333905 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:53 crc kubenswrapper[4911]: I0310 14:01:53.336109 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:53 crc kubenswrapper[4911]: I0310 14:01:53.336187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:53 crc kubenswrapper[4911]: I0310 14:01:53.336212 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:53 crc kubenswrapper[4911]: I0310 14:01:53.337304 4911 scope.go:117] "RemoveContainer" containerID="674dac9f9aef1286090b1f084437944cac4d7ae7d6917bc203d5adac05b8bedc" Mar 10 14:01:53 crc kubenswrapper[4911]: E0310 14:01:53.337714 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:01:53 crc kubenswrapper[4911]: I0310 14:01:53.724525 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:53 crc kubenswrapper[4911]: I0310 14:01:53.761146 4911 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 14:01:53 crc kubenswrapper[4911]: I0310 14:01:53.761261 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 14:01:54 crc kubenswrapper[4911]: W0310 14:01:54.045459 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:54Z is after 2026-02-23T05:33:13Z Mar 10 14:01:54 crc kubenswrapper[4911]: E0310 14:01:54.045627 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 14:01:54 crc kubenswrapper[4911]: I0310 14:01:54.124907 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-10T14:01:54Z is after 2026-02-23T05:33:13Z Mar 10 14:01:54 crc kubenswrapper[4911]: I0310 14:01:54.336526 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:54 crc kubenswrapper[4911]: I0310 14:01:54.338027 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:54 crc kubenswrapper[4911]: I0310 14:01:54.338093 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:54 crc kubenswrapper[4911]: I0310 14:01:54.338113 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:54 crc kubenswrapper[4911]: I0310 14:01:54.339330 4911 scope.go:117] "RemoveContainer" containerID="674dac9f9aef1286090b1f084437944cac4d7ae7d6917bc203d5adac05b8bedc" Mar 10 14:01:54 crc kubenswrapper[4911]: E0310 14:01:54.339658 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:01:54 crc kubenswrapper[4911]: W0310 14:01:54.661835 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:54Z is after 2026-02-23T05:33:13Z Mar 10 14:01:54 crc kubenswrapper[4911]: E0310 14:01:54.661967 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: 
failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 14:01:54 crc kubenswrapper[4911]: W0310 14:01:54.777678 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:54Z is after 2026-02-23T05:33:13Z Mar 10 14:01:54 crc kubenswrapper[4911]: E0310 14:01:54.777849 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 14:01:55 crc kubenswrapper[4911]: I0310 14:01:55.125827 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:55Z is after 2026-02-23T05:33:13Z Mar 10 14:01:55 crc kubenswrapper[4911]: I0310 14:01:55.340164 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:55 crc kubenswrapper[4911]: I0310 14:01:55.344459 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
14:01:55 crc kubenswrapper[4911]: I0310 14:01:55.344532 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:55 crc kubenswrapper[4911]: I0310 14:01:55.344570 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:55 crc kubenswrapper[4911]: I0310 14:01:55.347204 4911 scope.go:117] "RemoveContainer" containerID="674dac9f9aef1286090b1f084437944cac4d7ae7d6917bc203d5adac05b8bedc" Mar 10 14:01:55 crc kubenswrapper[4911]: E0310 14:01:55.347818 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.092659 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.122179 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:01:56Z is after 2026-02-23T05:33:13Z Mar 10 14:01:56 crc kubenswrapper[4911]: E0310 14:01:56.311579 4911 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.342106 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.343175 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.343214 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.343224 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.343695 4911 scope.go:117] "RemoveContainer" containerID="674dac9f9aef1286090b1f084437944cac4d7ae7d6917bc203d5adac05b8bedc" Mar 10 14:01:56 crc kubenswrapper[4911]: E0310 14:01:56.343869 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.870234 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.872110 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.872242 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.872272 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:01:56 crc kubenswrapper[4911]: I0310 14:01:56.872329 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 14:01:56 crc kubenswrapper[4911]: E0310 14:01:56.876801 4911 
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 14:01:56 crc kubenswrapper[4911]: E0310 14:01:56.877505 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 14:01:57 crc kubenswrapper[4911]: I0310 14:01:57.125873 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:01:58 crc kubenswrapper[4911]: I0310 14:01:58.127242 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:01:58 crc kubenswrapper[4911]: I0310 14:01:58.729247 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 14:01:58 crc kubenswrapper[4911]: I0310 14:01:58.751999 4911 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 14:01:59 crc kubenswrapper[4911]: I0310 14:01:59.125466 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:01:59 crc kubenswrapper[4911]: W0310 14:01:59.626051 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User 
"system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 14:01:59 crc kubenswrapper[4911]: E0310 14:01:59.626171 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.006589 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb591d740ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.113033422 +0000 UTC m=+0.676553379,LastTimestamp:2026-03-10 14:01:36.113033422 +0000 UTC m=+0.676553379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.013385 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964db125 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187904293 +0000 UTC m=+0.751424210,LastTimestamp:2026-03-10 
14:01:36.187904293 +0000 UTC m=+0.751424210,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.017687 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e147b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187929723 +0000 UTC m=+0.751449640,LastTimestamp:2026-03-10 14:01:36.187929723 +0000 UTC m=+0.751449640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.023263 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e41f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187941363 +0000 UTC m=+0.751461280,LastTimestamp:2026-03-10 14:01:36.187941363 +0000 UTC m=+0.751461280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc 
kubenswrapper[4911]: E0310 14:02:00.028961 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb59d0b2249 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.300982857 +0000 UTC m=+0.864502784,LastTimestamp:2026-03-10 14:01:36.300982857 +0000 UTC m=+0.864502784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.034404 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964db125\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964db125 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187904293 +0000 UTC m=+0.751424210,LastTimestamp:2026-03-10 14:01:36.400487423 +0000 UTC m=+0.964007370,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.038907 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e147b\" is forbidden: User \"system:anonymous\" cannot 
patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e147b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187929723 +0000 UTC m=+0.751449640,LastTimestamp:2026-03-10 14:01:36.400526833 +0000 UTC m=+0.964046780,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.043604 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e41f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e41f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187941363 +0000 UTC m=+0.751461280,LastTimestamp:2026-03-10 14:01:36.400544472 +0000 UTC m=+0.964064419,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.047775 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964db125\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964db125 default 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187904293 +0000 UTC m=+0.751424210,LastTimestamp:2026-03-10 14:01:36.496012787 +0000 UTC m=+1.059532744,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.053255 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e147b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e147b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187929723 +0000 UTC m=+0.751449640,LastTimestamp:2026-03-10 14:01:36.496054957 +0000 UTC m=+1.059574904,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.058675 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e41f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e41f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187941363 +0000 UTC m=+0.751461280,LastTimestamp:2026-03-10 14:01:36.496071837 +0000 UTC m=+1.059591794,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.063701 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964db125\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964db125 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187904293 +0000 UTC m=+0.751424210,LastTimestamp:2026-03-10 14:01:36.498047389 +0000 UTC m=+1.061567346,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.069994 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e147b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e147b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187929723 +0000 UTC m=+0.751449640,LastTimestamp:2026-03-10 14:01:36.498097099 +0000 UTC m=+1.061617056,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.078284 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e41f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e41f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187941363 +0000 UTC m=+0.751461280,LastTimestamp:2026-03-10 14:01:36.498116918 +0000 UTC m=+1.061636885,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.086361 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964db125\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964db125 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187904293 +0000 UTC 
m=+0.751424210,LastTimestamp:2026-03-10 14:01:36.498392756 +0000 UTC m=+1.061912713,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.093552 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e147b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e147b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187929723 +0000 UTC m=+0.751449640,LastTimestamp:2026-03-10 14:01:36.498466635 +0000 UTC m=+1.061986592,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.100660 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e41f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e41f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187941363 +0000 UTC m=+0.751461280,LastTimestamp:2026-03-10 14:01:36.498768022 +0000 UTC m=+1.062287979,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.105356 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964db125\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964db125 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187904293 +0000 UTC m=+0.751424210,LastTimestamp:2026-03-10 14:01:36.500295979 +0000 UTC m=+1.063815936,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.111626 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e147b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e147b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187929723 +0000 UTC m=+0.751449640,LastTimestamp:2026-03-10 14:01:36.500319998 +0000 UTC m=+1.063839955,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.117661 4911 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e41f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e41f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187941363 +0000 UTC m=+0.751461280,LastTimestamp:2026-03-10 14:01:36.500339938 +0000 UTC m=+1.063859895,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.123068 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964db125\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964db125 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187904293 +0000 UTC m=+0.751424210,LastTimestamp:2026-03-10 14:01:36.500462847 +0000 UTC m=+1.063982814,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: I0310 14:02:00.123226 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.125671 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e147b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e147b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187929723 +0000 UTC m=+0.751449640,LastTimestamp:2026-03-10 14:01:36.500514467 +0000 UTC m=+1.064034424,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.130373 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e41f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e41f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187941363 +0000 UTC m=+0.751461280,LastTimestamp:2026-03-10 14:01:36.500541426 +0000 UTC m=+1.064061383,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.136278 4911 event.go:359] "Server rejected 
event (will not retry!)" err="events \"crc.189b7fb5964db125\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964db125 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187904293 +0000 UTC m=+0.751424210,LastTimestamp:2026-03-10 14:01:36.502489909 +0000 UTC m=+1.066009866,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.142180 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b7fb5964e147b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b7fb5964e147b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.187929723 +0000 UTC m=+0.751449640,LastTimestamp:2026-03-10 14:01:36.502527538 +0000 UTC m=+1.066047495,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.151776 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb5c321c1e8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.93999972 +0000 UTC m=+1.503519667,LastTimestamp:2026-03-10 14:01:36.93999972 +0000 UTC m=+1.503519667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.156785 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb5c3233383 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.940094339 +0000 UTC m=+1.503614296,LastTimestamp:2026-03-10 14:01:36.940094339 +0000 UTC 
m=+1.503614296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.161676 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b7fb5c41ff37e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.956658558 +0000 UTC m=+1.520178505,LastTimestamp:2026-03-10 14:01:36.956658558 +0000 UTC m=+1.520178505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.168497 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb5c4ab7d3d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.965803325 +0000 UTC m=+1.529323282,LastTimestamp:2026-03-10 14:01:36.965803325 +0000 UTC m=+1.529323282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.173607 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb5c4ac20ad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:36.965845165 +0000 UTC m=+1.529365112,LastTimestamp:2026-03-10 14:01:36.965845165 +0000 UTC m=+1.529365112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.178479 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb5e9334f87 openshift-kube-scheduler 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.578684295 +0000 UTC m=+2.142204222,LastTimestamp:2026-03-10 14:01:37.578684295 +0000 UTC m=+2.142204222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.186836 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb5e9441583 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.579783555 +0000 UTC m=+2.143303482,LastTimestamp:2026-03-10 14:01:37.579783555 +0000 UTC m=+2.143303482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.194867 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb5e9870fb9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.584172985 +0000 UTC m=+2.147692912,LastTimestamp:2026-03-10 14:01:37.584172985 +0000 UTC m=+2.147692912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.201271 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb5e9c950a0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.588514976 +0000 UTC m=+2.152034903,LastTimestamp:2026-03-10 14:01:37.588514976 +0000 UTC m=+2.152034903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.207130 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb5e9dc9136 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.589776694 +0000 UTC m=+2.153296631,LastTimestamp:2026-03-10 14:01:37.589776694 +0000 UTC m=+2.153296631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.213781 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b7fb5e9e1a5dd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.590109661 +0000 UTC m=+2.153629588,LastTimestamp:2026-03-10 14:01:37.590109661 +0000 UTC m=+2.153629588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.219236 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb5ea16cf5a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.59359369 +0000 UTC m=+2.157113607,LastTimestamp:2026-03-10 14:01:37.59359369 +0000 UTC m=+2.157113607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.223783 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb5ea2bd271 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.594970737 +0000 UTC m=+2.158490654,LastTimestamp:2026-03-10 14:01:37.594970737 +0000 UTC m=+2.158490654,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.228445 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb5ea979df9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.602035193 +0000 UTC m=+2.165555120,LastTimestamp:2026-03-10 14:01:37.602035193 +0000 UTC m=+2.165555120,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.232653 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb5eaabe195 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.603363221 +0000 UTC m=+2.166883148,LastTimestamp:2026-03-10 14:01:37.603363221 +0000 UTC m=+2.166883148,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.238085 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b7fb5eb4ac364 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.613775716 +0000 UTC m=+2.177295623,LastTimestamp:2026-03-10 14:01:37.613775716 +0000 UTC m=+2.177295623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.245257 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb6000d8933 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.962084659 +0000 UTC 
m=+2.525604586,LastTimestamp:2026-03-10 14:01:37.962084659 +0000 UTC m=+2.525604586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.251974 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb600d5e802 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.97521613 +0000 UTC m=+2.538736057,LastTimestamp:2026-03-10 14:01:37.97521613 +0000 UTC m=+2.538736057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.257907 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb600e9231b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.976476443 +0000 UTC m=+2.539996400,LastTimestamp:2026-03-10 14:01:37.976476443 +0000 UTC m=+2.539996400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.268372 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb60cbc7d08 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.174876936 +0000 UTC m=+2.738396853,LastTimestamp:2026-03-10 14:01:38.174876936 +0000 UTC m=+2.738396853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.275469 4911 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb60da85f5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.190335837 +0000 UTC m=+2.753855764,LastTimestamp:2026-03-10 14:01:38.190335837 +0000 UTC m=+2.753855764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.283430 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb60dc83f41 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 
14:01:38.192424769 +0000 UTC m=+2.755944686,LastTimestamp:2026-03-10 14:01:38.192424769 +0000 UTC m=+2.755944686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.291696 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb60f5a6442 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.218779714 +0000 UTC m=+2.782299651,LastTimestamp:2026-03-10 14:01:38.218779714 +0000 UTC m=+2.782299651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.297470 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb60f6bcaa1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.219920033 +0000 UTC m=+2.783439960,LastTimestamp:2026-03-10 14:01:38.219920033 +0000 UTC m=+2.783439960,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.305145 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b7fb60f770cf5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.220657909 +0000 UTC m=+2.784177836,LastTimestamp:2026-03-10 14:01:38.220657909 +0000 UTC m=+2.784177836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.312355 4911 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb60fa1eff7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.223468535 +0000 UTC m=+2.786988592,LastTimestamp:2026-03-10 14:01:38.223468535 +0000 UTC m=+2.786988592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.315115 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb61cd05164 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.44461194 +0000 UTC 
m=+3.008131857,LastTimestamp:2026-03-10 14:01:38.44461194 +0000 UTC m=+3.008131857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.318336 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b7fb61cf7f033 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.447208499 +0000 UTC m=+3.010728416,LastTimestamp:2026-03-10 14:01:38.447208499 +0000 UTC m=+3.010728416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.322495 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb61d4822ab openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container 
kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.452464299 +0000 UTC m=+3.015984216,LastTimestamp:2026-03-10 14:01:38.452464299 +0000 UTC m=+3.015984216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.325945 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb61d85255e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.456462686 +0000 UTC m=+3.019982604,LastTimestamp:2026-03-10 14:01:38.456462686 +0000 UTC m=+3.019982604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.330197 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb61dcd70c3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.461200579 +0000 UTC m=+3.024720486,LastTimestamp:2026-03-10 14:01:38.461200579 +0000 UTC m=+3.024720486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.337507 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b7fb61df3818d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.463695245 +0000 UTC m=+3.027215182,LastTimestamp:2026-03-10 14:01:38.463695245 +0000 UTC m=+3.027215182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.344333 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb61e0d281b openshift-etcd 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.465376283 +0000 UTC m=+3.028896200,LastTimestamp:2026-03-10 14:01:38.465376283 +0000 UTC m=+3.028896200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.351974 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb61e72378a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.47199937 +0000 UTC m=+3.035519287,LastTimestamp:2026-03-10 14:01:38.47199937 +0000 UTC m=+3.035519287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.359954 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb61e8fff15 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.473950997 +0000 UTC m=+3.037470914,LastTimestamp:2026-03-10 14:01:38.473950997 +0000 UTC m=+3.037470914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.366593 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb61f2abcfb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.484092155 +0000 UTC m=+3.047612072,LastTimestamp:2026-03-10 14:01:38.484092155 +0000 UTC m=+3.047612072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.374020 4911 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb61f43e9a8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.485741992 +0000 UTC m=+3.049261909,LastTimestamp:2026-03-10 14:01:38.485741992 +0000 UTC m=+3.049261909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.382371 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb61f628385 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.487747461 +0000 UTC m=+3.051267378,LastTimestamp:2026-03-10 14:01:38.487747461 +0000 UTC m=+3.051267378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.390929 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb629a4b83b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.659858491 +0000 UTC m=+3.223378408,LastTimestamp:2026-03-10 14:01:38.659858491 +0000 UTC m=+3.223378408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.399332 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb629ba68ec openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.66127998 +0000 UTC m=+3.224799887,LastTimestamp:2026-03-10 
14:01:38.66127998 +0000 UTC m=+3.224799887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.406539 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb62a7e9dd6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.674138582 +0000 UTC m=+3.237658509,LastTimestamp:2026-03-10 14:01:38.674138582 +0000 UTC m=+3.237658509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.414091 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb62a8ede90 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.675203728 +0000 UTC m=+3.238723645,LastTimestamp:2026-03-10 14:01:38.675203728 +0000 UTC m=+3.238723645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.425822 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb62a9c7065 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.676093029 +0000 UTC m=+3.239612936,LastTimestamp:2026-03-10 14:01:38.676093029 +0000 UTC m=+3.239612936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.431655 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb62aa5e0ba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.67671161 +0000 UTC m=+3.240231527,LastTimestamp:2026-03-10 14:01:38.67671161 +0000 UTC m=+3.240231527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.436875 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb637300af9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.887092985 +0000 UTC m=+3.450612902,LastTimestamp:2026-03-10 14:01:38.887092985 +0000 UTC m=+3.450612902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.442093 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb6375adc99 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.889899161 +0000 UTC m=+3.453419078,LastTimestamp:2026-03-10 14:01:38.889899161 +0000 UTC m=+3.453419078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.447260 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb637dea9c2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.898536898 +0000 UTC m=+3.462056825,LastTimestamp:2026-03-10 14:01:38.898536898 +0000 UTC m=+3.462056825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.452672 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb637fd6613 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:38.900551187 +0000 UTC m=+3.464071104,LastTimestamp:2026-03-10 14:01:38.900551187 +0000 UTC m=+3.464071104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.457708 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b7fb6386f6f30 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 
14:01:38.908024624 +0000 UTC m=+3.471544541,LastTimestamp:2026-03-10 14:01:38.908024624 +0000 UTC m=+3.471544541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.462716 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb642dc720c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.08294094 +0000 UTC m=+3.646460857,LastTimestamp:2026-03-10 14:01:39.08294094 +0000 UTC m=+3.646460857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.467805 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb643a07833 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.095787571 +0000 UTC m=+3.659307488,LastTimestamp:2026-03-10 14:01:39.095787571 +0000 UTC m=+3.659307488,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.473085 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb643b64d60 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.0972184 +0000 UTC m=+3.660738327,LastTimestamp:2026-03-10 14:01:39.0972184 +0000 UTC m=+3.660738327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.479186 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb64bddc26e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.234021998 +0000 UTC m=+3.797541915,LastTimestamp:2026-03-10 14:01:39.234021998 +0000 UTC m=+3.797541915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.487307 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb64f749db0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.294240176 +0000 UTC m=+3.857760093,LastTimestamp:2026-03-10 14:01:39.294240176 +0000 UTC m=+3.857760093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.495346 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb64ffaab28 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.303025448 +0000 UTC m=+3.866545365,LastTimestamp:2026-03-10 14:01:39.303025448 +0000 UTC m=+3.866545365,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.502943 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb656b644c1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.415983297 +0000 UTC m=+3.979503214,LastTimestamp:2026-03-10 14:01:39.415983297 +0000 UTC m=+3.979503214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.510969 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6576332c0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.427316416 +0000 UTC m=+3.990836343,LastTimestamp:2026-03-10 14:01:39.427316416 +0000 UTC m=+3.990836343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.518053 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb68989aecc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:40.26869934 +0000 UTC m=+4.832219247,LastTimestamp:2026-03-10 14:01:40.26869934 +0000 UTC m=+4.832219247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.522756 4911 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb694ec432e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:40.45970923 +0000 UTC m=+5.023229147,LastTimestamp:2026-03-10 14:01:40.45970923 +0000 UTC m=+5.023229147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.524718 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb69571694f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:40.468435279 +0000 UTC m=+5.031955196,LastTimestamp:2026-03-10 14:01:40.468435279 +0000 UTC m=+5.031955196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.527882 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6958651b9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:40.469805497 +0000 UTC m=+5.033325414,LastTimestamp:2026-03-10 14:01:40.469805497 +0000 UTC m=+5.033325414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.532236 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb69fce8b27 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:40.642310951 +0000 UTC m=+5.205830868,LastTimestamp:2026-03-10 14:01:40.642310951 +0000 UTC m=+5.205830868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.537450 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6a07832c2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:40.653429442 +0000 UTC m=+5.216949359,LastTimestamp:2026-03-10 14:01:40.653429442 +0000 UTC m=+5.216949359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.542181 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6a0868dcf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:40.654370255 +0000 UTC m=+5.217890172,LastTimestamp:2026-03-10 14:01:40.654370255 +0000 UTC m=+5.217890172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.547099 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6ab80ae17 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:40.838534679 +0000 UTC m=+5.402054606,LastTimestamp:2026-03-10 14:01:40.838534679 +0000 UTC m=+5.402054606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.552344 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6ac19337e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:40.848530302 +0000 UTC m=+5.412050229,LastTimestamp:2026-03-10 14:01:40.848530302 +0000 UTC m=+5.412050229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.556752 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6ac311e8e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:40.850097806 +0000 UTC m=+5.413617743,LastTimestamp:2026-03-10 14:01:40.850097806 +0000 UTC m=+5.413617743,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.561852 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6b5c7db3b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:41.010971451 +0000 UTC m=+5.574491368,LastTimestamp:2026-03-10 14:01:41.010971451 +0000 UTC m=+5.574491368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.569196 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6b6448eab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:41.019143851 +0000 UTC m=+5.582663768,LastTimestamp:2026-03-10 14:01:41.019143851 +0000 UTC m=+5.582663768,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.573926 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6b651dc41 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:41.020015681 +0000 UTC m=+5.583535598,LastTimestamp:2026-03-10 14:01:41.020015681 +0000 UTC m=+5.583535598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.580185 4911 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6bedfc020 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:41.16353232 +0000 UTC m=+5.727052237,LastTimestamp:2026-03-10 14:01:41.16353232 +0000 UTC m=+5.727052237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.585544 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b7fb6bf87aea0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:41.174537888 +0000 UTC m=+5.738057805,LastTimestamp:2026-03-10 14:01:41.174537888 +0000 UTC m=+5.738057805,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.594270 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 14:02:00 crc kubenswrapper[4911]: &Event{ObjectMeta:{kube-controller-manager-crc.189b7fb759b7e4a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 14:02:00 crc kubenswrapper[4911]: body: Mar 10 14:02:00 crc kubenswrapper[4911]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:43.761388704 +0000 UTC m=+8.324908661,LastTimestamp:2026-03-10 14:01:43.761388704 +0000 UTC m=+8.324908661,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 14:02:00 crc kubenswrapper[4911]: > Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.602909 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb759b91085 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:43.761465477 +0000 UTC m=+8.324985434,LastTimestamp:2026-03-10 14:01:43.761465477 +0000 UTC m=+8.324985434,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.611185 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 14:02:00 crc kubenswrapper[4911]: &Event{ObjectMeta:{kube-apiserver-crc.189b7fb8e99f6622 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 14:02:00 crc kubenswrapper[4911]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 14:02:00 crc kubenswrapper[4911]: Mar 10 14:02:00 crc kubenswrapper[4911]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:50.470669858 +0000 UTC m=+15.034189775,LastTimestamp:2026-03-10 14:01:50.470669858 +0000 UTC m=+15.034189775,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 14:02:00 crc kubenswrapper[4911]: > Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.617507 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb8e99ff377 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:50.470706039 +0000 UTC m=+15.034225956,LastTimestamp:2026-03-10 14:01:50.470706039 +0000 UTC m=+15.034225956,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.623157 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b7fb8e99f6622\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 14:02:00 crc kubenswrapper[4911]: &Event{ObjectMeta:{kube-apiserver-crc.189b7fb8e99f6622 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 14:02:00 crc kubenswrapper[4911]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 14:02:00 crc kubenswrapper[4911]: Mar 10 14:02:00 crc 
kubenswrapper[4911]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:50.470669858 +0000 UTC m=+15.034189775,LastTimestamp:2026-03-10 14:01:50.480279728 +0000 UTC m=+15.043799645,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 14:02:00 crc kubenswrapper[4911]: > Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.629445 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b7fb8e99ff377\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb8e99ff377 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:50.470706039 +0000 UTC m=+15.034225956,LastTimestamp:2026-03-10 14:01:50.480324429 +0000 UTC m=+15.043844346,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.635856 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b7fb643b64d60\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb643b64d60 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.0972184 +0000 UTC m=+3.660738327,LastTimestamp:2026-03-10 14:01:51.319793308 +0000 UTC m=+15.883313235,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.641080 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b7fb64f749db0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb64f749db0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.294240176 +0000 UTC m=+3.857760093,LastTimestamp:2026-03-10 14:01:51.589305023 +0000 UTC m=+16.152824940,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.646141 4911 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189b7fb64ffaab28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b7fb64ffaab28 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:39.303025448 +0000 UTC m=+3.866545365,LastTimestamp:2026-03-10 14:01:51.5985323 +0000 UTC m=+16.162052217,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.652417 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b7fb759b7e4a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 14:02:00 crc kubenswrapper[4911]: &Event{ObjectMeta:{kube-controller-manager-crc.189b7fb759b7e4a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 14:02:00 crc kubenswrapper[4911]: body: Mar 10 14:02:00 crc 
kubenswrapper[4911]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:43.761388704 +0000 UTC m=+8.324908661,LastTimestamp:2026-03-10 14:01:53.761223949 +0000 UTC m=+18.324743876,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 14:02:00 crc kubenswrapper[4911]: > Mar 10 14:02:00 crc kubenswrapper[4911]: E0310 14:02:00.658085 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b7fb759b91085\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb759b91085 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:43.761465477 +0000 UTC m=+8.324985434,LastTimestamp:2026-03-10 14:01:53.761302932 +0000 UTC m=+18.324822859,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:01 crc kubenswrapper[4911]: I0310 14:02:01.128502 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:02 crc 
kubenswrapper[4911]: I0310 14:02:02.128483 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:02 crc kubenswrapper[4911]: W0310 14:02:02.664388 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 14:02:02 crc kubenswrapper[4911]: E0310 14:02:02.664486 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.128511 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.760655 4911 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.760804 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.760905 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.761239 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.764286 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.764491 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.764575 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.766847 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.767353 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1" gracePeriod=30 Mar 10 14:02:03 crc kubenswrapper[4911]: E0310 14:02:03.768102 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b7fb759b7e4a0\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 14:02:03 crc kubenswrapper[4911]: &Event{ObjectMeta:{kube-controller-manager-crc.189b7fb759b7e4a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 14:02:03 crc kubenswrapper[4911]: body: Mar 10 14:02:03 crc kubenswrapper[4911]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:43.761388704 +0000 UTC m=+8.324908661,LastTimestamp:2026-03-10 14:02:03.760760625 +0000 UTC m=+28.324280572,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 14:02:03 crc kubenswrapper[4911]: > Mar 10 14:02:03 crc kubenswrapper[4911]: E0310 14:02:03.771141 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b7fb759b91085\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb759b91085 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:43.761465477 +0000 UTC m=+8.324985434,LastTimestamp:2026-03-10 14:02:03.760858088 +0000 UTC m=+28.324378035,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:03 crc kubenswrapper[4911]: E0310 14:02:03.780144 4911 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fbc022a0700 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:02:03.76731008 +0000 UTC m=+28.330830107,LastTimestamp:2026-03-10 14:02:03.76731008 +0000 UTC m=+28.330830107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.877955 4911 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.880818 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.880908 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.880942 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:03 crc kubenswrapper[4911]: I0310 14:02:03.880987 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 14:02:03 crc kubenswrapper[4911]: E0310 14:02:03.887130 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 14:02:03 crc kubenswrapper[4911]: E0310 14:02:03.887151 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 14:02:03 crc kubenswrapper[4911]: E0310 14:02:03.901476 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b7fb5ea2bd271\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb5ea2bd271 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.594970737 +0000 UTC m=+2.158490654,LastTimestamp:2026-03-10 14:02:03.892464448 +0000 UTC m=+28.455984395,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:04 crc kubenswrapper[4911]: I0310 14:02:04.128009 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:04 crc kubenswrapper[4911]: E0310 14:02:04.157117 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b7fb6000d8933\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb6000d8933 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.962084659 +0000 UTC 
m=+2.525604586,LastTimestamp:2026-03-10 14:02:04.148936394 +0000 UTC m=+28.712456321,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:04 crc kubenswrapper[4911]: E0310 14:02:04.173317 4911 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b7fb600d5e802\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b7fb600d5e802 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:01:37.97521613 +0000 UTC m=+2.538736057,LastTimestamp:2026-03-10 14:02:04.165524994 +0000 UTC m=+28.729044951,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:02:04 crc kubenswrapper[4911]: I0310 14:02:04.371493 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 14:02:04 crc kubenswrapper[4911]: I0310 14:02:04.372190 4911 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1" exitCode=255 Mar 10 14:02:04 crc kubenswrapper[4911]: I0310 14:02:04.372265 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1"} Mar 10 14:02:04 crc kubenswrapper[4911]: I0310 14:02:04.372342 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59"} Mar 10 14:02:04 crc kubenswrapper[4911]: I0310 14:02:04.372502 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:04 crc kubenswrapper[4911]: I0310 14:02:04.373835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:04 crc kubenswrapper[4911]: I0310 14:02:04.373907 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:04 crc kubenswrapper[4911]: I0310 14:02:04.373928 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:04 crc kubenswrapper[4911]: W0310 14:02:04.608709 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:04 crc kubenswrapper[4911]: E0310 14:02:04.608852 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 14:02:05 crc kubenswrapper[4911]: I0310 14:02:05.127563 4911 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:05 crc kubenswrapper[4911]: W0310 14:02:05.900119 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 10 14:02:05 crc kubenswrapper[4911]: E0310 14:02:05.900191 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 14:02:06 crc kubenswrapper[4911]: I0310 14:02:06.126774 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:06 crc kubenswrapper[4911]: E0310 14:02:06.312043 4911 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 14:02:06 crc kubenswrapper[4911]: I0310 14:02:06.555685 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:02:06 crc kubenswrapper[4911]: I0310 14:02:06.556183 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:06 crc kubenswrapper[4911]: I0310 14:02:06.558996 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
14:02:06 crc kubenswrapper[4911]: I0310 14:02:06.559091 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:06 crc kubenswrapper[4911]: I0310 14:02:06.559134 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:07 crc kubenswrapper[4911]: I0310 14:02:07.127195 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:08 crc kubenswrapper[4911]: I0310 14:02:08.125921 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:09 crc kubenswrapper[4911]: I0310 14:02:09.126424 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.131509 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.193257 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.195275 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.195369 4911 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.195396 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.196654 4911 scope.go:117] "RemoveContainer" containerID="674dac9f9aef1286090b1f084437944cac4d7ae7d6917bc203d5adac05b8bedc" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.760320 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.761129 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.762523 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.762562 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.762578 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.771662 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.888156 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.890210 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.890267 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.890285 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:10 crc kubenswrapper[4911]: I0310 14:02:10.890328 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 14:02:10 crc kubenswrapper[4911]: E0310 14:02:10.893383 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 14:02:10 crc kubenswrapper[4911]: E0310 14:02:10.893462 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 14:02:11 crc kubenswrapper[4911]: I0310 14:02:11.124690 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:11 crc kubenswrapper[4911]: I0310 14:02:11.399180 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 14:02:11 crc kubenswrapper[4911]: I0310 14:02:11.400797 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:11 crc kubenswrapper[4911]: I0310 14:02:11.401226 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf0d0f93e011867da32a4d68f5591d1478e03acd09831cdcfcfd6b56ea022a53"} Mar 10 14:02:11 crc 
kubenswrapper[4911]: I0310 14:02:11.401313 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:11 crc kubenswrapper[4911]: I0310 14:02:11.401851 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:11 crc kubenswrapper[4911]: I0310 14:02:11.401874 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:11 crc kubenswrapper[4911]: I0310 14:02:11.401883 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:11 crc kubenswrapper[4911]: I0310 14:02:11.409299 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:11 crc kubenswrapper[4911]: I0310 14:02:11.409365 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:11 crc kubenswrapper[4911]: I0310 14:02:11.409389 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:12 crc kubenswrapper[4911]: I0310 14:02:12.124620 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:12 crc kubenswrapper[4911]: I0310 14:02:12.407994 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 14:02:12 crc kubenswrapper[4911]: I0310 14:02:12.409369 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 14:02:12 crc 
kubenswrapper[4911]: I0310 14:02:12.411995 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf0d0f93e011867da32a4d68f5591d1478e03acd09831cdcfcfd6b56ea022a53" exitCode=255 Mar 10 14:02:12 crc kubenswrapper[4911]: I0310 14:02:12.412072 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cf0d0f93e011867da32a4d68f5591d1478e03acd09831cdcfcfd6b56ea022a53"} Mar 10 14:02:12 crc kubenswrapper[4911]: I0310 14:02:12.412157 4911 scope.go:117] "RemoveContainer" containerID="674dac9f9aef1286090b1f084437944cac4d7ae7d6917bc203d5adac05b8bedc" Mar 10 14:02:12 crc kubenswrapper[4911]: I0310 14:02:12.412441 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:12 crc kubenswrapper[4911]: I0310 14:02:12.414025 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:12 crc kubenswrapper[4911]: I0310 14:02:12.414063 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:12 crc kubenswrapper[4911]: I0310 14:02:12.414076 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:12 crc kubenswrapper[4911]: I0310 14:02:12.415678 4911 scope.go:117] "RemoveContainer" containerID="cf0d0f93e011867da32a4d68f5591d1478e03acd09831cdcfcfd6b56ea022a53" Mar 10 14:02:12 crc kubenswrapper[4911]: E0310 14:02:12.416041 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:02:13 crc kubenswrapper[4911]: I0310 14:02:13.124716 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:13 crc kubenswrapper[4911]: I0310 14:02:13.416375 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 14:02:13 crc kubenswrapper[4911]: W0310 14:02:13.655429 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 14:02:13 crc kubenswrapper[4911]: E0310 14:02:13.655531 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 14:02:13 crc kubenswrapper[4911]: I0310 14:02:13.724937 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:02:13 crc kubenswrapper[4911]: I0310 14:02:13.725909 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:13 crc kubenswrapper[4911]: I0310 14:02:13.727701 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:13 crc kubenswrapper[4911]: I0310 14:02:13.727816 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:13 crc kubenswrapper[4911]: I0310 
14:02:13.727838 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:13 crc kubenswrapper[4911]: I0310 14:02:13.728821 4911 scope.go:117] "RemoveContainer" containerID="cf0d0f93e011867da32a4d68f5591d1478e03acd09831cdcfcfd6b56ea022a53" Mar 10 14:02:13 crc kubenswrapper[4911]: E0310 14:02:13.729222 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:02:14 crc kubenswrapper[4911]: I0310 14:02:14.122603 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:15 crc kubenswrapper[4911]: I0310 14:02:15.125670 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:15 crc kubenswrapper[4911]: W0310 14:02:15.824681 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 14:02:15 crc kubenswrapper[4911]: E0310 14:02:15.824816 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" 
logger="UnhandledError" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.092886 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.093183 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.095145 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.095215 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.095230 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.096116 4911 scope.go:117] "RemoveContainer" containerID="cf0d0f93e011867da32a4d68f5591d1478e03acd09831cdcfcfd6b56ea022a53" Mar 10 14:02:16 crc kubenswrapper[4911]: E0310 14:02:16.096391 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.126051 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:16 crc kubenswrapper[4911]: E0310 14:02:16.312230 4911 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to 
get node info: node \"crc\" not found" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.560919 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.561116 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.562878 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.562954 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:16 crc kubenswrapper[4911]: I0310 14:02:16.562980 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:17 crc kubenswrapper[4911]: I0310 14:02:17.120370 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 14:02:17 crc kubenswrapper[4911]: I0310 14:02:17.894410 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:17 crc kubenswrapper[4911]: I0310 14:02:17.896954 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:17 crc kubenswrapper[4911]: I0310 14:02:17.897024 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:17 crc kubenswrapper[4911]: I0310 14:02:17.897062 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:17 crc kubenswrapper[4911]: I0310 14:02:17.897107 4911 
kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 14:02:17 crc kubenswrapper[4911]: E0310 14:02:17.903335 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 14:02:17 crc kubenswrapper[4911]: E0310 14:02:17.903440 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 14:02:18 crc kubenswrapper[4911]: I0310 14:02:18.123703 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:19 crc kubenswrapper[4911]: I0310 14:02:19.126531 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:20 crc kubenswrapper[4911]: I0310 14:02:20.127853 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:21 crc kubenswrapper[4911]: I0310 14:02:21.121064 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:22 crc kubenswrapper[4911]: I0310 14:02:22.126623 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:23 crc kubenswrapper[4911]: I0310 14:02:23.127577 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:24 crc kubenswrapper[4911]: I0310 14:02:24.127616 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:24 crc kubenswrapper[4911]: W0310 14:02:24.248036 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 10 14:02:24 crc kubenswrapper[4911]: E0310 14:02:24.248123 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 10 14:02:24 crc kubenswrapper[4911]: I0310 14:02:24.904296 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 14:02:24 crc kubenswrapper[4911]: I0310 14:02:24.906038 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:24 crc kubenswrapper[4911]: I0310 14:02:24.906098 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:24 crc kubenswrapper[4911]: I0310 14:02:24.906115 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:24 crc kubenswrapper[4911]: I0310 14:02:24.906153 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 14:02:24 crc kubenswrapper[4911]: E0310 14:02:24.913132 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 14:02:24 crc kubenswrapper[4911]: E0310 14:02:24.913298 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 14:02:25 crc kubenswrapper[4911]: I0310 14:02:25.126091 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:25 crc kubenswrapper[4911]: I0310 14:02:25.582487 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 14:02:25 crc kubenswrapper[4911]: I0310 14:02:25.582777 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 14:02:25 crc kubenswrapper[4911]: I0310 14:02:25.584380 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:25 crc kubenswrapper[4911]: I0310 14:02:25.584439 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:25 crc kubenswrapper[4911]: I0310 14:02:25.584461 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:26 crc kubenswrapper[4911]: I0310 14:02:26.127979 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:26 crc kubenswrapper[4911]: E0310 14:02:26.312408 4911 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 14:02:27 crc kubenswrapper[4911]: I0310 14:02:27.127077 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:27 crc kubenswrapper[4911]: I0310 14:02:27.193300 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 14:02:27 crc kubenswrapper[4911]: I0310 14:02:27.195350 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:27 crc kubenswrapper[4911]: I0310 14:02:27.195414 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:27 crc kubenswrapper[4911]: I0310 14:02:27.195431 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:27 crc kubenswrapper[4911]: I0310 14:02:27.196489 4911 scope.go:117] "RemoveContainer" containerID="cf0d0f93e011867da32a4d68f5591d1478e03acd09831cdcfcfd6b56ea022a53"
Mar 10 14:02:27 crc kubenswrapper[4911]: E0310 14:02:27.196845 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 14:02:28 crc kubenswrapper[4911]: I0310 14:02:28.127059 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:28 crc kubenswrapper[4911]: W0310 14:02:28.719100 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:28 crc kubenswrapper[4911]: E0310 14:02:28.719159 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 10 14:02:29 crc kubenswrapper[4911]: I0310 14:02:29.123248 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:30 crc kubenswrapper[4911]: I0310 14:02:30.126666 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:31 crc kubenswrapper[4911]: I0310 14:02:31.128214 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:31 crc kubenswrapper[4911]: I0310 14:02:31.913793 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 14:02:31 crc kubenswrapper[4911]: I0310 14:02:31.915965 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:31 crc kubenswrapper[4911]: I0310 14:02:31.916054 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:31 crc kubenswrapper[4911]: I0310 14:02:31.916082 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:31 crc kubenswrapper[4911]: I0310 14:02:31.916140 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 14:02:31 crc kubenswrapper[4911]: E0310 14:02:31.922420 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 14:02:31 crc kubenswrapper[4911]: E0310 14:02:31.922462 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 14:02:32 crc kubenswrapper[4911]: I0310 14:02:32.121124 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:33 crc kubenswrapper[4911]: I0310 14:02:33.127336 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:34 crc kubenswrapper[4911]: I0310 14:02:34.124024 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:35 crc kubenswrapper[4911]: I0310 14:02:35.134492 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:36 crc kubenswrapper[4911]: I0310 14:02:36.128188 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:36 crc kubenswrapper[4911]: E0310 14:02:36.313042 4911 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 14:02:37 crc kubenswrapper[4911]: I0310 14:02:37.125258 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:38 crc kubenswrapper[4911]: I0310 14:02:38.127281 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:38 crc kubenswrapper[4911]: I0310 14:02:38.923547 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 14:02:38 crc kubenswrapper[4911]: I0310 14:02:38.925677 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:38 crc kubenswrapper[4911]: I0310 14:02:38.925749 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:38 crc kubenswrapper[4911]: I0310 14:02:38.925761 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:38 crc kubenswrapper[4911]: I0310 14:02:38.925802 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 14:02:38 crc kubenswrapper[4911]: E0310 14:02:38.933326 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 14:02:38 crc kubenswrapper[4911]: E0310 14:02:38.933647 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 14:02:39 crc kubenswrapper[4911]: I0310 14:02:39.127474 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:40 crc kubenswrapper[4911]: I0310 14:02:40.127633 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 14:02:40 crc kubenswrapper[4911]: I0310 14:02:40.797559 4911 csr.go:261] certificate signing request csr-7vhmk is approved, waiting to be issued
Mar 10 14:02:40 crc kubenswrapper[4911]: I0310 14:02:40.806888 4911 csr.go:257] certificate signing request csr-7vhmk is issued
Mar 10 14:02:40 crc kubenswrapper[4911]: I0310 14:02:40.879015 4911 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 10 14:02:40 crc kubenswrapper[4911]: I0310 14:02:40.946355 4911 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 10 14:02:41 crc kubenswrapper[4911]: I0310 14:02:41.808523 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-12 16:32:34.243845164 +0000 UTC
Mar 10 14:02:41 crc kubenswrapper[4911]: I0310 14:02:41.808584 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6650h29m52.435267369s for next certificate rotation
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.193064 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.194592 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.194650 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.194669 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.196040 4911 scope.go:117] "RemoveContainer" containerID="cf0d0f93e011867da32a4d68f5591d1478e03acd09831cdcfcfd6b56ea022a53"
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.508225 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.510330 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666"}
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.510540 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.511611 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.511646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:42 crc kubenswrapper[4911]: I0310 14:02:42.511661 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.513516 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.514075 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.515468 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666" exitCode=255
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.515511 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666"}
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.515553 4911 scope.go:117] "RemoveContainer" containerID="cf0d0f93e011867da32a4d68f5591d1478e03acd09831cdcfcfd6b56ea022a53"
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.515697 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.516619 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.516650 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.516660 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.517283 4911 scope.go:117] "RemoveContainer" containerID="75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666"
Mar 10 14:02:43 crc kubenswrapper[4911]: E0310 14:02:43.517458 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 14:02:43 crc kubenswrapper[4911]: I0310 14:02:43.724872 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:02:44 crc kubenswrapper[4911]: I0310 14:02:44.520354 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 10 14:02:44 crc kubenswrapper[4911]: I0310 14:02:44.522066 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 14:02:44 crc kubenswrapper[4911]: I0310 14:02:44.523248 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:44 crc kubenswrapper[4911]: I0310 14:02:44.523296 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:44 crc kubenswrapper[4911]: I0310 14:02:44.523311 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:44 crc kubenswrapper[4911]: I0310 14:02:44.524027 4911 scope.go:117] "RemoveContainer" containerID="75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666"
Mar 10 14:02:44 crc kubenswrapper[4911]: E0310 14:02:44.524205 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.933931 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.935710 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.935789 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.935803 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.936002 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.946236 4911 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.946590 4911 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 10 14:02:45 crc kubenswrapper[4911]: E0310 14:02:45.946635 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.951260 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.951294 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.951304 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.951320 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.951333 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:45Z","lastTransitionTime":"2026-03-10T14:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:02:45 crc kubenswrapper[4911]: E0310 14:02:45.976421 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.985598 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.985658 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.985673 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.985695 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:45 crc kubenswrapper[4911]: I0310 14:02:45.985712 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:45Z","lastTransitionTime":"2026-03-10T14:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.001582 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.015332 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.015385 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.015417 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.015441 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.015460 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:46Z","lastTransitionTime":"2026-03-10T14:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.027217 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.039497 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.039554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.039572 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.039600 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.039619 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:46Z","lastTransitionTime":"2026-03-10T14:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.060717 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.060931 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.060977 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.092646 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.092948 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.094366 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.094433 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.094454 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:46 crc kubenswrapper[4911]: I0310 14:02:46.095774 4911 scope.go:117] "RemoveContainer" containerID="75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.096108 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.162051 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.262823 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.313270 4911 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.363551 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.464086 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.564812 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.665390 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.766171 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.866411 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:46 crc kubenswrapper[4911]: E0310 14:02:46.967236 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:47 crc kubenswrapper[4911]: E0310 14:02:47.068162 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 
14:02:47 crc kubenswrapper[4911]: E0310 14:02:47.168548 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:47 crc kubenswrapper[4911]: E0310 14:02:47.269584 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:47 crc kubenswrapper[4911]: E0310 14:02:47.369882 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:47 crc kubenswrapper[4911]: E0310 14:02:47.470248 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:47 crc kubenswrapper[4911]: E0310 14:02:47.570816 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.657643 4911 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.673953 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.673989 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.673997 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.674009 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.674018 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:47Z","lastTransitionTime":"2026-03-10T14:02:47Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.776482 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.776526 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.776536 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.776553 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.776564 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:47Z","lastTransitionTime":"2026-03-10T14:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.878703 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.878793 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.878811 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.878837 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.878861 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:47Z","lastTransitionTime":"2026-03-10T14:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.982021 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.982113 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.982134 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.982164 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:47 crc kubenswrapper[4911]: I0310 14:02:47.982185 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:47Z","lastTransitionTime":"2026-03-10T14:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.091635 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.091688 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.091699 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.091715 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.091742 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:48Z","lastTransitionTime":"2026-03-10T14:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.140824 4911 apiserver.go:52] "Watching apiserver" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.149523 4911 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.150330 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-z255c","openshift-multus/multus-nsxjn","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-machine-config-operator/machine-config-daemon-tg8sx","openshift-multus/network-metrics-daemon-r28f8","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8","openshift-dns/node-resolver-9p6ll","openshift-image-registry/node-ca-vfj7m","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-4256n","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.150871 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.150987 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.151399 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.151458 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.151531 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.151906 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vfj7m" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.152261 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.152338 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.152585 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.152618 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9p6ll" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.152638 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.152685 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.152753 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.153380 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.153426 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.153705 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.154238 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.160222 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.162623 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.162700 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.162819 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.163108 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.163343 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.163592 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.163786 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.163954 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164068 4911 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164129 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164156 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164071 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164213 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164291 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164301 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164356 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164382 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164406 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164497 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 14:02:48 crc 
kubenswrapper[4911]: I0310 14:02:48.164534 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164554 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164576 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164633 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164655 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164667 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164813 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164921 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164986 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.165079 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164499 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 
14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.164499 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.165301 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.165467 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.165518 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.166418 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.171858 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.172493 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.185712 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.195537 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.195593 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.196343 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.196394 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.196990 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:48Z","lastTransitionTime":"2026-03-10T14:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.204810 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.221259 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223495 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223576 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-netns\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223612 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-run-k8s-cni-cncf-io\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223649 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp4ld\" (UniqueName: \"kubernetes.io/projected/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-kube-api-access-zp4ld\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223684 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjz78\" (UniqueName: 
\"kubernetes.io/projected/349ee2ee-803a-404b-9aa2-2230eabdbb56-kube-api-access-gjz78\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223717 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2970cff-e2bc-40e6-9d80-7388d88e840e-proxy-tls\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223786 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-kubelet\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223821 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223855 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/349ee2ee-803a-404b-9aa2-2230eabdbb56-cni-binary-copy\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 
14:02:48.223886 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e2970cff-e2bc-40e6-9d80-7388d88e840e-rootfs\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223915 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-env-overrides\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223950 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-systemd-units\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.223983 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-script-lib\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224014 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-system-cni-dir\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " 
pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224045 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224076 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7cae19a2-5349-4f13-94ab-bfe066e4589a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224107 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-var-lib-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224209 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-os-release\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224249 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-node-log\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224289 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-config\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224330 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed2b430b-2281-4231-9135-f0289be08cdd-ovn-node-metrics-cert\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224370 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a0fdaa42-a77f-4f62-b94f-6659225e12af-hosts-file\") pod \"node-resolver-9p6ll\" (UID: \"a0fdaa42-a77f-4f62-b94f-6659225e12af\") " pod="openshift-dns/node-resolver-9p6ll" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224404 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224437 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7cae19a2-5349-4f13-94ab-bfe066e4589a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224793 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-slash\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.224965 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc662696-d402-4969-bebd-00fa42e63075-cni-binary-copy\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225003 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-multus-socket-dir-parent\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225020 4911 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225033 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2970cff-e2bc-40e6-9d80-7388d88e840e-mcd-auth-proxy-config\") pod \"machine-config-daemon-tg8sx\" (UID: 
\"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225060 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225096 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n2ph\" (UniqueName: \"kubernetes.io/projected/7cae19a2-5349-4f13-94ab-bfe066e4589a-kube-api-access-9n2ph\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225124 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-log-socket\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225146 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-cnibin\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225166 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-systemd\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225219 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225264 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225302 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87sf4\" (UniqueName: \"kubernetes.io/projected/a0fdaa42-a77f-4f62-b94f-6659225e12af-kube-api-access-87sf4\") pod \"node-resolver-9p6ll\" (UID: \"a0fdaa42-a77f-4f62-b94f-6659225e12af\") " pod="openshift-dns/node-resolver-9p6ll" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.225326 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.225432 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:02:48 
crc kubenswrapper[4911]: I0310 14:02:48.225402 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgrqg\" (UniqueName: \"kubernetes.io/projected/e2970cff-e2bc-40e6-9d80-7388d88e840e-kube-api-access-qgrqg\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225467 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7cae19a2-5349-4f13-94ab-bfe066e4589a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.225539 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:48.725502539 +0000 UTC m=+73.289022486 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225578 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-etc-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225603 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-system-cni-dir\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225625 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-cnibin\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.225673 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:48.725651392 +0000 UTC m=+73.289171339 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225718 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-ovn\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225800 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225836 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-bin\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225871 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh7d8\" (UniqueName: \"kubernetes.io/projected/ed2b430b-2281-4231-9135-f0289be08cdd-kube-api-access-rh7d8\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 
14:02:48.225901 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-os-release\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225943 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-multus-cni-dir\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.225975 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/349ee2ee-803a-404b-9aa2-2230eabdbb56-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.226006 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.226028 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-netd\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 
10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.234077 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.238410 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.238462 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.238482 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.238614 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:48.738556875 +0000 UTC m=+73.302076802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.242209 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.242260 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.242283 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.242437 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:48.742409327 +0000 UTC m=+73.305929274 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.258221 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.273273 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.286104 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.330854 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.331105 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.331220 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.331368 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.331389 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.331544 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.331573 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.331599 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 
14:02:48.331627 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.331743 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.331782 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.331856 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332003 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332167 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332182 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332211 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332251 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332298 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332378 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332451 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332489 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332604 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332596 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332635 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332658 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.332740 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333005 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333053 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333153 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333280 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333303 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333298 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333314 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333390 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333414 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:48Z","lastTransitionTime":"2026-03-10T14:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333519 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333535 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333538 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333615 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333662 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.333952 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334037 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). 
InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334131 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334307 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334321 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334388 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334589 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334772 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334792 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334819 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334826 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334866 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.334918 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.335214 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.335333 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.335376 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.335403 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.335402 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.335426 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.335451 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.335456 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.335541 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.335645 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336043 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336082 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336096 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336287 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336106 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.337313 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.337712 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.338096 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336383 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336581 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336649 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336837 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336296 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.336909 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.337054 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.337150 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.337227 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.337649 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.338020 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.338260 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.338283 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.338484 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.338632 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.338927 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339003 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339045 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339075 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339103 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339104 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339125 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339352 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339390 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339409 4911 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339415 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339600 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339626 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339626 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339648 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339684 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339710 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339747 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339771 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339904 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339934 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339956 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.339983 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340002 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340024 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340047 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340108 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340127 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340144 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340163 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340200 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340224 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340244 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340266 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340286 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340306 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340331 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340332 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340353 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340379 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340679 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 
14:02:48.340698 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340730 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340750 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340772 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340794 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340818 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340843 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340864 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340883 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340901 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340921 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340940 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340959 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341000 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341023 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341063 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341084 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341104 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341124 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341147 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341167 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341185 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 14:02:48 crc 
kubenswrapper[4911]: I0310 14:02:48.341205 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341223 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341242 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341340 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341360 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341382 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341404 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341425 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341450 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341475 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341498 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341516 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341536 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341552 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341568 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341584 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341602 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341621 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341637 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341658 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341676 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341693 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 
14:02:48.341743 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341762 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341782 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341804 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341822 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341842 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" 
(UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341917 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341939 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341961 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341982 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342003 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342024 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342050 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342072 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342094 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342114 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342133 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 14:02:48 crc kubenswrapper[4911]: 
I0310 14:02:48.342154 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342176 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342195 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342239 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342258 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342299 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342322 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342341 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342359 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342379 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342398 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342430 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342451 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342470 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342489 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342508 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342527 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342549 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342570 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342587 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342604 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342622 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342640 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342778 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342801 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342819 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342837 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342856 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342873 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342891 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342908 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342927 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342945 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342966 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342984 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343003 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343024 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343048 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343076 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343094 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343112 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343130 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343147 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343165 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343183 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343201 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343220 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343237 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343254 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343275 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343295 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343313 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343332 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343350 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343368 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343386 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343404 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343566 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343594 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/acf8e218-4a2a-4d62-9aa8-7ecca1109d35-serviceca\") pod \"node-ca-vfj7m\" (UID: \"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\") " pod="openshift-image-registry/node-ca-vfj7m"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343615 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-system-cni-dir\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343634 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-systemd-units\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343651 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-script-lib\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343672 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-var-lib-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343690 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-etc-kubernetes\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343707 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343766 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343788 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7cae19a2-5349-4f13-94ab-bfe066e4589a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343808 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-node-log\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343828 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-config\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343847 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed2b430b-2281-4231-9135-f0289be08cdd-ovn-node-metrics-cert\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343866 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-var-lib-kubelet\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343887 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343906 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-os-release\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343926 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a0fdaa42-a77f-4f62-b94f-6659225e12af-hosts-file\") pod \"node-resolver-9p6ll\" (UID: \"a0fdaa42-a77f-4f62-b94f-6659225e12af\") " pod="openshift-dns/node-resolver-9p6ll"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343945 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fc662696-d402-4969-bebd-00fa42e63075-multus-daemon-config\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343965 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343985 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-slash\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344004 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc662696-d402-4969-bebd-00fa42e63075-cni-binary-copy\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344028 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-multus-socket-dir-parent\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344047 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-var-lib-cni-multus\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344070 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344096 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344114 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7cae19a2-5349-4f13-94ab-bfe066e4589a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344132 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-var-lib-cni-bin\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344149 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-multus-conf-dir\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344167 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxrrk\" (UniqueName: \"kubernetes.io/projected/fc662696-d402-4969-bebd-00fa42e63075-kube-api-access-jxrrk\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344189 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2970cff-e2bc-40e6-9d80-7388d88e840e-mcd-auth-proxy-config\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344211 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344251 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-cnibin\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344275 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n2ph\" (UniqueName: \"kubernetes.io/projected/7cae19a2-5349-4f13-94ab-bfe066e4589a-kube-api-access-9n2ph\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344294 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-log-socket\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344312 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-systemd\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344332 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87sf4\" (UniqueName: \"kubernetes.io/projected/a0fdaa42-a77f-4f62-b94f-6659225e12af-kube-api-access-87sf4\") pod \"node-resolver-9p6ll\" (UID: \"a0fdaa42-a77f-4f62-b94f-6659225e12af\") " pod="openshift-dns/node-resolver-9p6ll"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344351 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgrqg\" (UniqueName: \"kubernetes.io/projected/e2970cff-e2bc-40e6-9d80-7388d88e840e-kube-api-access-qgrqg\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344368 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7cae19a2-5349-4f13-94ab-bfe066e4589a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344387 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-etc-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344406 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-system-cni-dir\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344428 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-cnibin\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344470 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344490 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-run-netns\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344508 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-bin\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344529 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh7d8\" (UniqueName: \"kubernetes.io/projected/ed2b430b-2281-4231-9135-f0289be08cdd-kube-api-access-rh7d8\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344548 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-os-release\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344567 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acf8e218-4a2a-4d62-9aa8-7ecca1109d35-host\") pod \"node-ca-vfj7m\" (UID: \"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\") " pod="openshift-image-registry/node-ca-vfj7m"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344588 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-ovn\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344609 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344634 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-multus-cni-dir\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344654 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344672 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/349ee2ee-803a-404b-9aa2-2230eabdbb56-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344702 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-netd\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344732 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344785 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344807 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344828 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dplb\" (UniqueName: \"kubernetes.io/projected/acf8e218-4a2a-4d62-9aa8-7ecca1109d35-kube-api-access-9dplb\") pod \"node-ca-vfj7m\" (UID: \"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\") " pod="openshift-image-registry/node-ca-vfj7m" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344846 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-run-k8s-cni-cncf-io\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344862 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-run-multus-certs\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344891 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-netns\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344931 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjz78\" (UniqueName: \"kubernetes.io/projected/349ee2ee-803a-404b-9aa2-2230eabdbb56-kube-api-access-gjz78\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344951 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2970cff-e2bc-40e6-9d80-7388d88e840e-proxy-tls\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344969 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-kubelet\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344987 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345020 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp4ld\" (UniqueName: \"kubernetes.io/projected/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-kube-api-access-zp4ld\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345041 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-env-overrides\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345060 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-hostroot\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345083 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/349ee2ee-803a-404b-9aa2-2230eabdbb56-cni-binary-copy\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345100 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/e2970cff-e2bc-40e6-9d80-7388d88e840e-rootfs\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345178 4911 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345191 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345206 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345218 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345231 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345243 4911 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345254 4911 reconciler_common.go:293] "Volume detached for 
volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345265 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345276 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345287 4911 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345298 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345310 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345320 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345331 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345341 4911 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345353 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345363 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345372 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345382 4911 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345409 4911 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345420 4911 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345430 4911 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345441 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345452 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345462 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345472 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345481 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345491 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345503 4911 reconciler_common.go:293] "Volume 
detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345512 4911 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345524 4911 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345534 4911 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345545 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345554 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345565 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345575 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" 
DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345586 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345596 4911 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345607 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345618 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345628 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345639 4911 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345651 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345661 4911 
reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345758 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e2970cff-e2bc-40e6-9d80-7388d88e840e-rootfs\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.346410 4911 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.346495 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2970cff-e2bc-40e6-9d80-7388d88e840e-mcd-auth-proxy-config\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.346551 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.346588 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-cnibin\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 
14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.347467 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-log-socket\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.347504 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-systemd\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340441 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340600 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340750 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340832 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.340818 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341239 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341343 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341357 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341640 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.341996 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342036 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342313 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342458 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342825 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.342993 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343038 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343054 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343240 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343255 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.343981 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344213 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344516 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344555 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.344602 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345002 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345171 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345156 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345573 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345603 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345658 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345881 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.345977 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.346128 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.346153 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.346357 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.346573 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.346809 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.347045 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.347097 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.347261 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.347466 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.347800 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.348418 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.348715 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.349074 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.349593 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.349974 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.350008 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.350329 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.350524 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.350918 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.350932 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.351225 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.351270 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.351305 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.351500 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.351615 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.351869 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.351881 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.351970 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.358798 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.352331 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.352466 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.353074 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.353042 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.353642 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.354191 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.354349 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.354913 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.355002 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.355494 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.355666 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.355926 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.356105 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.356324 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.356700 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.356974 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.357265 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.357296 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.357832 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.357895 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.358190 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.358560 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.359016 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.359039 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.359307 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.359379 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.359541 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.359567 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.359767 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.359917 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.360038 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.360077 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7cae19a2-5349-4f13-94ab-bfe066e4589a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8"
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.360867 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.360879 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.361232 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.361384 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.362229 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.362651 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.363051 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.363695 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.363774 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.363827 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-node-log\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.364389 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-config\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.365035 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.365187 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-run-k8s-cni-cncf-io\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.365279 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-netns\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.366164 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.366343 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.350125 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-system-cni-dir\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.358988 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7cae19a2-5349-4f13-94ab-bfe066e4589a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.366448 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-cnibin\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.366967 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n2ph\" (UniqueName: \"kubernetes.io/projected/7cae19a2-5349-4f13-94ab-bfe066e4589a-kube-api-access-9n2ph\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.350089 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-etc-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.368297 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-kubelet\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.368318 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.369059 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-env-overrides\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.369680 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/349ee2ee-803a-404b-9aa2-2230eabdbb56-cni-binary-copy\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.369772 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-var-lib-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.370351 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-os-release\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.370422 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a0fdaa42-a77f-4f62-b94f-6659225e12af-hosts-file\") pod \"node-resolver-9p6ll\" (UID: \"a0fdaa42-a77f-4f62-b94f-6659225e12af\") " pod="openshift-dns/node-resolver-9p6ll" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.372155 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7cae19a2-5349-4f13-94ab-bfe066e4589a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xxbj8\" (UID: \"7cae19a2-5349-4f13-94ab-bfe066e4589a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.372340 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.372403 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-slash\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.372361 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-multus-socket-dir-parent\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.373196 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.373286 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs podName:d7a44efc-20ad-4c01-9606-e6fdb5e0c721 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:48.873254932 +0000 UTC m=+73.436775069 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs") pod "network-metrics-daemon-r28f8" (UID: "d7a44efc-20ad-4c01-9606-e6fdb5e0c721") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.373381 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.374513 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2970cff-e2bc-40e6-9d80-7388d88e840e-proxy-tls\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.374886 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.374989 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed2b430b-2281-4231-9135-f0289be08cdd-ovn-node-metrics-cert\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.375192 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.376709 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.377130 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.377423 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.377879 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.377968 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/349ee2ee-803a-404b-9aa2-2230eabdbb56-system-cni-dir\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.378005 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-systemd-units\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.378337 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgrqg\" (UniqueName: \"kubernetes.io/projected/e2970cff-e2bc-40e6-9d80-7388d88e840e-kube-api-access-qgrqg\") pod \"machine-config-daemon-tg8sx\" (UID: \"e2970cff-e2bc-40e6-9d80-7388d88e840e\") " pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.378513 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-script-lib\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.381857 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.381951 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.382093 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-bin\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.382987 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-os-release\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.383177 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-ovn\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.383519 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-multus-cni-dir\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.385519 4911 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.385319 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.385744 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.385997 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/349ee2ee-803a-404b-9aa2-2230eabdbb56-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.386035 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-netd\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.386351 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.386869 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.387284 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.387860 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.390885 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.391135 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87sf4\" (UniqueName: \"kubernetes.io/projected/a0fdaa42-a77f-4f62-b94f-6659225e12af-kube-api-access-87sf4\") pod \"node-resolver-9p6ll\" (UID: \"a0fdaa42-a77f-4f62-b94f-6659225e12af\") " pod="openshift-dns/node-resolver-9p6ll" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.393329 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp4ld\" (UniqueName: \"kubernetes.io/projected/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-kube-api-access-zp4ld\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.393528 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.394497 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.395012 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.396184 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:02:48.89615847 +0000 UTC m=+73.459678387 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.396243 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.396468 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjz78\" (UniqueName: \"kubernetes.io/projected/349ee2ee-803a-404b-9aa2-2230eabdbb56-kube-api-access-gjz78\") pod \"multus-additional-cni-plugins-z255c\" (UID: \"349ee2ee-803a-404b-9aa2-2230eabdbb56\") " pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.397391 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.375551 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc662696-d402-4969-bebd-00fa42e63075-cni-binary-copy\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.398083 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.398923 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.399009 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.399128 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.399884 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.400298 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.400585 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.400709 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.400866 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.400968 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.401124 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.399905 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.402391 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.402483 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh7d8\" (UniqueName: \"kubernetes.io/projected/ed2b430b-2281-4231-9135-f0289be08cdd-kube-api-access-rh7d8\") pod \"ovnkube-node-4256n\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.403305 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.404137 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: 
"kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.404132 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.404227 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.404779 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.404613 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.405074 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.405570 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.406989 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.407956 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.408057 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.408911 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.409581 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.409621 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.409815 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.410045 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.410410 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.410561 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.411013 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.412327 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.415795 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.416437 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.419485 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"r
eady\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reas
on\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.425409 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.428388 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.430940 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.440341 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.440366 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.440376 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.440392 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.440401 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:48Z","lastTransitionTime":"2026-03-10T14:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.441792 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.446420 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.446609 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-run-netns\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.446801 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acf8e218-4a2a-4d62-9aa8-7ecca1109d35-host\") pod \"node-ca-vfj7m\" (UID: \"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\") " pod="openshift-image-registry/node-ca-vfj7m" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 
14:02:48.446991 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447185 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dplb\" (UniqueName: \"kubernetes.io/projected/acf8e218-4a2a-4d62-9aa8-7ecca1109d35-kube-api-access-9dplb\") pod \"node-ca-vfj7m\" (UID: \"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\") " pod="openshift-image-registry/node-ca-vfj7m" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447316 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-run-multus-certs\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447434 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-run-multus-certs\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447033 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.446557 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.446873 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acf8e218-4a2a-4d62-9aa8-7ecca1109d35-host\") pod \"node-ca-vfj7m\" (UID: \"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\") " pod="openshift-image-registry/node-ca-vfj7m" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.446685 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-run-netns\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447440 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-hostroot\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447550 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/acf8e218-4a2a-4d62-9aa8-7ecca1109d35-serviceca\") pod \"node-ca-vfj7m\" (UID: \"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\") " pod="openshift-image-registry/node-ca-vfj7m" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447581 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-etc-kubernetes\") pod 
\"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447610 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-var-lib-kubelet\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447639 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fc662696-d402-4969-bebd-00fa42e63075-multus-daemon-config\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447652 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-etc-kubernetes\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447667 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-var-lib-cni-multus\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447684 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-var-lib-kubelet\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc 
kubenswrapper[4911]: I0310 14:02:48.447714 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-var-lib-cni-multus\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447715 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-multus-conf-dir\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447752 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-multus-conf-dir\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447769 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxrrk\" (UniqueName: \"kubernetes.io/projected/fc662696-d402-4969-bebd-00fa42e63075-kube-api-access-jxrrk\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447801 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-var-lib-cni-bin\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447904 4911 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447921 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447936 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447949 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447964 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447978 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.447991 4911 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448004 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448017 4911 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448030 4911 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448042 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448054 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448066 4911 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448079 4911 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448087 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc 
kubenswrapper[4911]: I0310 14:02:48.448097 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448106 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448115 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448124 4911 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448135 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448144 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448154 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448164 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448175 4911 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448184 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448194 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448203 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448213 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448222 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448232 4911 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc 
kubenswrapper[4911]: I0310 14:02:48.448241 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448250 4911 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448260 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448270 4911 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448279 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448284 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fc662696-d402-4969-bebd-00fa42e63075-multus-daemon-config\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448288 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc 
kubenswrapper[4911]: I0310 14:02:48.448327 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448342 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448352 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448362 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448373 4911 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448381 4911 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448391 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448404 4911 
reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448417 4911 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448429 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448440 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448451 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448460 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448468 4911 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448477 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448488 4911 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448500 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448516 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448308 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-host-var-lib-cni-bin\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448528 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448617 4911 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448628 4911 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448637 4911 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448646 4911 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448655 4911 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448664 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448673 4911 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448683 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448692 4911 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448702 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448714 4911 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448750 4911 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448765 4911 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448778 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448793 4911 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448806 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448689 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/acf8e218-4a2a-4d62-9aa8-7ecca1109d35-serviceca\") pod \"node-ca-vfj7m\" (UID: \"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\") " pod="openshift-image-registry/node-ca-vfj7m" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448822 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448896 4911 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448908 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448919 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448930 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448941 4911 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448952 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448962 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448971 4911 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448980 4911 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448989 4911 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.448998 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449007 4911 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449017 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449027 4911 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449036 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449049 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449059 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449067 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449075 4911 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node 
\"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449085 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449093 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449103 4911 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449114 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449123 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449132 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449142 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449151 4911 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449167 4911 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449176 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449186 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449195 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449203 4911 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449212 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449222 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449232 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449240 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449249 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449258 4911 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449266 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449276 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449285 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449294 4911 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449303 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449312 4911 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449320 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449329 4911 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449338 4911 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449350 4911 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449359 4911 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449368 4911 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449377 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449386 4911 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449395 4911 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449405 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449413 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 
14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449422 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449431 4911 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449440 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449448 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449457 4911 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449466 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449474 4911 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449483 4911 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449492 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449501 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449510 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449519 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449528 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449537 4911 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449546 4911 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449553 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449564 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449572 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449582 4911 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449590 4911 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449599 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449608 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.449647 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.452689 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.456105 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fc662696-d402-4969-bebd-00fa42e63075-hostroot\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.464796 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dplb\" (UniqueName: \"kubernetes.io/projected/acf8e218-4a2a-4d62-9aa8-7ecca1109d35-kube-api-access-9dplb\") pod \"node-ca-vfj7m\" (UID: \"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\") " pod="openshift-image-registry/node-ca-vfj7m" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.464928 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxrrk\" (UniqueName: \"kubernetes.io/projected/fc662696-d402-4969-bebd-00fa42e63075-kube-api-access-jxrrk\") pod \"multus-nsxjn\" (UID: \"fc662696-d402-4969-bebd-00fa42e63075\") " pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.474482 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.483406 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 14:02:48 crc kubenswrapper[4911]: W0310 14:02:48.489922 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-5708298891d41808a183d4bc0a693c22f472ace5c3a62024ff01ca815cd648b6 WatchSource:0}: Error finding container 5708298891d41808a183d4bc0a693c22f472ace5c3a62024ff01ca815cd648b6: Status 404 returned error can't find the container with id 5708298891d41808a183d4bc0a693c22f472ace5c3a62024ff01ca815cd648b6 Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.494440 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 14:02:48 crc kubenswrapper[4911]: set -o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: source /etc/kubernetes/apiserver-url.env Mar 10 14:02:48 crc kubenswrapper[4911]: else Mar 10 14:02:48 crc kubenswrapper[4911]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 14:02:48 crc kubenswrapper[4911]: exit 1 Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 14:02:48 crc kubenswrapper[4911]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.495332 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.495696 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 14:02:48 crc kubenswrapper[4911]: W0310 14:02:48.495815 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-4fbb1f39fb7ec1aa274eb64b5fb5e4458f64a1536ce22dee64cd93d2bc6fc59e WatchSource:0}: Error finding container 4fbb1f39fb7ec1aa274eb64b5fb5e4458f64a1536ce22dee64cd93d2bc6fc59e: Status 404 returned error can't find the container with id 4fbb1f39fb7ec1aa274eb64b5fb5e4458f64a1536ce22dee64cd93d2bc6fc59e Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.506927 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.506848 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ -f "/env/_master" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: set -o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: source "/env/_master" Mar 10 14:02:48 crc kubenswrapper[4911]: set +o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 14:02:48 crc kubenswrapper[4911]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 14:02:48 crc kubenswrapper[4911]: ho_enable="--enable-hybrid-overlay" Mar 10 14:02:48 crc kubenswrapper[4911]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 14:02:48 crc kubenswrapper[4911]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 14:02:48 crc kubenswrapper[4911]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 14:02:48 crc kubenswrapper[4911]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 14:02:48 crc kubenswrapper[4911]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --webhook-host=127.0.0.1 \ Mar 10 14:02:48 crc kubenswrapper[4911]: --webhook-port=9743 \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${ho_enable} \ Mar 10 14:02:48 crc kubenswrapper[4911]: --enable-interconnect \ Mar 10 14:02:48 crc kubenswrapper[4911]: --disable-approver \ Mar 10 
14:02:48 crc kubenswrapper[4911]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --wait-for-kubernetes-api=200s \ Mar 10 14:02:48 crc kubenswrapper[4911]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --loglevel="${LOGLEVEL}" Mar 10 14:02:48 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Std
in:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: W0310 14:02:48.507481 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cae19a2_5349_4f13_94ab_bfe066e4589a.slice/crio-56b247c57a86fe1161b1e83d73c60c39c6f5eb2eddcd3122fa374e2b07e3e562 WatchSource:0}: Error finding container 56b247c57a86fe1161b1e83d73c60c39c6f5eb2eddcd3122fa374e2b07e3e562: Status 404 returned error can't find the container with id 56b247c57a86fe1161b1e83d73c60c39c6f5eb2eddcd3122fa374e2b07e3e562 Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.512046 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 10 14:02:48 crc kubenswrapper[4911]: set -euo pipefail Mar 10 14:02:48 crc kubenswrapper[4911]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 10 14:02:48 crc kubenswrapper[4911]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 10 14:02:48 crc kubenswrapper[4911]: # As the secret mount is optional we must wait for the files to be present. Mar 10 14:02:48 crc kubenswrapper[4911]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 10 14:02:48 crc kubenswrapper[4911]: TS=$(date +%s) Mar 10 14:02:48 crc kubenswrapper[4911]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 10 14:02:48 crc kubenswrapper[4911]: HAS_LOGGED_INFO=0 Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: log_missing_certs(){ Mar 10 14:02:48 crc kubenswrapper[4911]: CUR_TS=$(date +%s) Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 10 14:02:48 crc kubenswrapper[4911]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 10 14:02:48 crc kubenswrapper[4911]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 10 14:02:48 crc kubenswrapper[4911]: HAS_LOGGED_INFO=1 Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: } Mar 10 14:02:48 crc kubenswrapper[4911]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 10 14:02:48 crc kubenswrapper[4911]: log_missing_certs Mar 10 14:02:48 crc kubenswrapper[4911]: sleep 5 Mar 10 14:02:48 crc kubenswrapper[4911]: done Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 10 14:02:48 crc kubenswrapper[4911]: exec /usr/bin/kube-rbac-proxy \ Mar 10 14:02:48 crc kubenswrapper[4911]: --logtostderr \ Mar 10 14:02:48 crc kubenswrapper[4911]: --secure-listen-address=:9108 \ Mar 10 14:02:48 crc kubenswrapper[4911]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 10 14:02:48 crc kubenswrapper[4911]: --upstream=http://127.0.0.1:29108/ \ Mar 10 14:02:48 crc kubenswrapper[4911]: --tls-private-key-file=${TLS_PK} \ Mar 10 14:02:48 crc kubenswrapper[4911]: --tls-cert-file=${TLS_CERT} Mar 10 14:02:48 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9n2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-xxbj8_openshift-ovn-kubernetes(7cae19a2-5349-4f13-94ab-bfe066e4589a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.512151 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ -f "/env/_master" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: set -o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: source "/env/_master" Mar 10 14:02:48 crc kubenswrapper[4911]: set +o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 14:02:48 crc kubenswrapper[4911]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 
14:02:48 crc kubenswrapper[4911]: --disable-webhook \ Mar 10 14:02:48 crc kubenswrapper[4911]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --loglevel="${LOGLEVEL}" Mar 10 14:02:48 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > 
logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.513247 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.514113 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ -f "/env/_master" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: set -o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: source "/env/_master" Mar 10 14:02:48 crc kubenswrapper[4911]: set +o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v4_join_subnet_opt= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v6_join_subnet_opt= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v4_transit_switch_subnet_opt= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 
14:02:48 crc kubenswrapper[4911]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v6_transit_switch_subnet_opt= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: dns_name_resolver_enabled_flag= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "false" == "true" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: persistent_ips_enabled_flag= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "true" == "true" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: # This is needed so that converting clusters from GA to TP Mar 10 14:02:48 crc kubenswrapper[4911]: # will rollout control plane pods as well Mar 10 14:02:48 crc kubenswrapper[4911]: network_segmentation_enabled_flag= Mar 10 14:02:48 crc kubenswrapper[4911]: multi_network_enabled_flag= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "true" == "true" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: multi_network_enabled_flag="--enable-multi-network" Mar 10 14:02:48 crc kubenswrapper[4911]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 
10 14:02:48 crc kubenswrapper[4911]: exec /usr/bin/ovnkube \ Mar 10 14:02:48 crc kubenswrapper[4911]: --enable-interconnect \ Mar 10 14:02:48 crc kubenswrapper[4911]: --init-cluster-manager "${K8S_NODE}" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 10 14:02:48 crc kubenswrapper[4911]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --metrics-bind-address "127.0.0.1:29108" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --metrics-enable-pprof \ Mar 10 14:02:48 crc kubenswrapper[4911]: --metrics-enable-config-duration \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${ovn_v4_join_subnet_opt} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${ovn_v6_join_subnet_opt} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${dns_name_resolver_enabled_flag} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${persistent_ips_enabled_flag} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${multi_network_enabled_flag} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${network_segmentation_enabled_flag} Mar 10 14:02:48 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9n2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-xxbj8_openshift-ovn-kubernetes(7cae19a2-5349-4f13-94ab-bfe066e4589a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.515426 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" podUID="7cae19a2-5349-4f13-94ab-bfe066e4589a" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.519454 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:02:48 crc kubenswrapper[4911]: W0310 14:02:48.520278 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-16d82bf87efb93126f80b26d3921bcbef2bbfe955e394a05ca365b8f53f4af15 WatchSource:0}: Error finding container 16d82bf87efb93126f80b26d3921bcbef2bbfe955e394a05ca365b8f53f4af15: Status 404 returned error can't find the container with id 16d82bf87efb93126f80b26d3921bcbef2bbfe955e394a05ca365b8f53f4af15 Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.526605 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.527855 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.530670 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vfj7m" Mar 10 14:02:48 crc kubenswrapper[4911]: W0310 14:02:48.535598 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2970cff_e2bc_40e6_9d80_7388d88e840e.slice/crio-abe2c76b8b957005db86ff1b700f841f0b0fbc54a7627f93991c098be87cbb22 WatchSource:0}: Error finding container abe2c76b8b957005db86ff1b700f841f0b0fbc54a7627f93991c098be87cbb22: Status 404 returned error can't find the container with id abe2c76b8b957005db86ff1b700f841f0b0fbc54a7627f93991c098be87cbb22 Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.536477 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"16d82bf87efb93126f80b26d3921bcbef2bbfe955e394a05ca365b8f53f4af15"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.540594 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9p6ll" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.540794 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:
[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.541368 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgrqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.541570 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" event={"ID":"7cae19a2-5349-4f13-94ab-bfe066e4589a","Type":"ContainerStarted","Data":"56b247c57a86fe1161b1e83d73c60c39c6f5eb2eddcd3122fa374e2b07e3e562"} Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.541970 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.543759 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgrqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.543839 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.543871 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.543882 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.543899 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.543910 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:48Z","lastTransitionTime":"2026-03-10T14:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.543860 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5708298891d41808a183d4bc0a693c22f472ace5c3a62024ff01ca815cd648b6"} Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.544843 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.545111 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4fbb1f39fb7ec1aa274eb64b5fb5e4458f64a1536ce22dee64cd93d2bc6fc59e"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.550298 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.551135 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.553115 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ -f "/env/_master" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: set -o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: source "/env/_master" Mar 10 14:02:48 crc kubenswrapper[4911]: set +o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 10 14:02:48 crc kubenswrapper[4911]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 14:02:48 crc kubenswrapper[4911]: ho_enable="--enable-hybrid-overlay" Mar 10 14:02:48 crc kubenswrapper[4911]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 14:02:48 crc kubenswrapper[4911]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 14:02:48 crc kubenswrapper[4911]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 14:02:48 crc kubenswrapper[4911]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 14:02:48 crc kubenswrapper[4911]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --webhook-host=127.0.0.1 \ Mar 10 14:02:48 crc kubenswrapper[4911]: --webhook-port=9743 \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${ho_enable} \ Mar 10 14:02:48 crc kubenswrapper[4911]: --enable-interconnect \ Mar 10 14:02:48 crc kubenswrapper[4911]: --disable-approver \ Mar 10 14:02:48 crc kubenswrapper[4911]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --wait-for-kubernetes-api=200s \ Mar 10 14:02:48 crc kubenswrapper[4911]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --loglevel="${LOGLEVEL}" Mar 10 14:02:48 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.553326 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 10 
14:02:48 crc kubenswrapper[4911]: set -euo pipefail Mar 10 14:02:48 crc kubenswrapper[4911]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 10 14:02:48 crc kubenswrapper[4911]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 10 14:02:48 crc kubenswrapper[4911]: # As the secret mount is optional we must wait for the files to be present. Mar 10 14:02:48 crc kubenswrapper[4911]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 10 14:02:48 crc kubenswrapper[4911]: TS=$(date +%s) Mar 10 14:02:48 crc kubenswrapper[4911]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 10 14:02:48 crc kubenswrapper[4911]: HAS_LOGGED_INFO=0 Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: log_missing_certs(){ Mar 10 14:02:48 crc kubenswrapper[4911]: CUR_TS=$(date +%s) Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 10 14:02:48 crc kubenswrapper[4911]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 10 14:02:48 crc kubenswrapper[4911]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 10 14:02:48 crc kubenswrapper[4911]: HAS_LOGGED_INFO=1 Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: } Mar 10 14:02:48 crc kubenswrapper[4911]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 10 14:02:48 crc kubenswrapper[4911]: log_missing_certs Mar 10 14:02:48 crc kubenswrapper[4911]: sleep 5 Mar 10 14:02:48 crc kubenswrapper[4911]: done Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 10 14:02:48 crc kubenswrapper[4911]: exec /usr/bin/kube-rbac-proxy \ Mar 10 14:02:48 crc kubenswrapper[4911]: --logtostderr \ Mar 10 14:02:48 crc kubenswrapper[4911]: --secure-listen-address=:9108 \ Mar 10 14:02:48 crc kubenswrapper[4911]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 10 14:02:48 crc kubenswrapper[4911]: --upstream=http://127.0.0.1:29108/ \ Mar 10 14:02:48 crc kubenswrapper[4911]: --tls-private-key-file=${TLS_PK} \ Mar 10 14:02:48 crc kubenswrapper[4911]: --tls-cert-file=${TLS_CERT} Mar 10 14:02:48 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9n2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-xxbj8_openshift-ovn-kubernetes(7cae19a2-5349-4f13-94ab-bfe066e4589a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.553519 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 14:02:48 crc kubenswrapper[4911]: set -o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: source /etc/kubernetes/apiserver-url.env Mar 10 14:02:48 crc kubenswrapper[4911]: else Mar 10 14:02:48 crc kubenswrapper[4911]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 14:02:48 crc kubenswrapper[4911]: exit 1 Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: exec /usr/bin/cluster-network-operator start 
--listen=0.0.0.0:9104 Mar 10 14:02:48 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},E
nvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectF
ieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.555432 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.557597 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: 
container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ -f "/env/_master" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: set -o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: source "/env/_master" Mar 10 14:02:48 crc kubenswrapper[4911]: set +o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 14:02:48 crc kubenswrapper[4911]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 14:02:48 crc kubenswrapper[4911]: --disable-webhook \ Mar 10 14:02:48 crc kubenswrapper[4911]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --loglevel="${LOGLEVEL}" Mar 10 14:02:48 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.557786 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 10 14:02:48 crc kubenswrapper[4911]: while [ true ]; Mar 10 14:02:48 crc kubenswrapper[4911]: do Mar 10 14:02:48 crc 
kubenswrapper[4911]: for f in $(ls /tmp/serviceca); do Mar 10 14:02:48 crc kubenswrapper[4911]: echo $f Mar 10 14:02:48 crc kubenswrapper[4911]: ca_file_path="/tmp/serviceca/${f}" Mar 10 14:02:48 crc kubenswrapper[4911]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 10 14:02:48 crc kubenswrapper[4911]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 10 14:02:48 crc kubenswrapper[4911]: if [ -e "${reg_dir_path}" ]; then Mar 10 14:02:48 crc kubenswrapper[4911]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 10 14:02:48 crc kubenswrapper[4911]: else Mar 10 14:02:48 crc kubenswrapper[4911]: mkdir $reg_dir_path Mar 10 14:02:48 crc kubenswrapper[4911]: cp $ca_file_path $reg_dir_path/ca.crt Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: done Mar 10 14:02:48 crc kubenswrapper[4911]: for d in $(ls /etc/docker/certs.d); do Mar 10 14:02:48 crc kubenswrapper[4911]: echo $d Mar 10 14:02:48 crc kubenswrapper[4911]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 10 14:02:48 crc kubenswrapper[4911]: reg_conf_path="/tmp/serviceca/${dp}" Mar 10 14:02:48 crc kubenswrapper[4911]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 10 14:02:48 crc kubenswrapper[4911]: rm -rf /etc/docker/certs.d/$d Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: done Mar 10 14:02:48 crc kubenswrapper[4911]: sleep 60 & wait ${!} Mar 10 14:02:48 crc kubenswrapper[4911]: done Mar 10 14:02:48 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dplb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-vfj7m_openshift-image-registry(acf8e218-4a2a-4d62-9aa8-7ecca1109d35): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.557960 4911 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ -f "/env/_master" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: set -o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: source "/env/_master" Mar 10 14:02:48 crc kubenswrapper[4911]: set +o allexport Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v4_join_subnet_opt= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v6_join_subnet_opt= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v4_transit_switch_subnet_opt= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v6_transit_switch_subnet_opt= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: dns_name_resolver_enabled_flag= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "false" == "true" 
]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: persistent_ips_enabled_flag= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "true" == "true" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: # This is needed so that converting clusters from GA to TP Mar 10 14:02:48 crc kubenswrapper[4911]: # will rollout control plane pods as well Mar 10 14:02:48 crc kubenswrapper[4911]: network_segmentation_enabled_flag= Mar 10 14:02:48 crc kubenswrapper[4911]: multi_network_enabled_flag= Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "true" == "true" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: multi_network_enabled_flag="--enable-multi-network" Mar 10 14:02:48 crc kubenswrapper[4911]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 10 14:02:48 crc kubenswrapper[4911]: exec /usr/bin/ovnkube \ Mar 10 14:02:48 crc kubenswrapper[4911]: --enable-interconnect \ Mar 10 14:02:48 crc kubenswrapper[4911]: --init-cluster-manager "${K8S_NODE}" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 10 14:02:48 crc kubenswrapper[4911]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --metrics-bind-address "127.0.0.1:29108" \ Mar 10 14:02:48 crc kubenswrapper[4911]: --metrics-enable-pprof \ Mar 10 14:02:48 crc kubenswrapper[4911]: --metrics-enable-config-duration \ Mar 10 14:02:48 crc 
kubenswrapper[4911]: ${ovn_v4_join_subnet_opt} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${ovn_v6_join_subnet_opt} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${dns_name_resolver_enabled_flag} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${persistent_ips_enabled_flag} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${multi_network_enabled_flag} \ Mar 10 14:02:48 crc kubenswrapper[4911]: ${network_segmentation_enabled_flag} Mar 10 14:02:48 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9n2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-xxbj8_openshift-ovn-kubernetes(7cae19a2-5349-4f13-94ab-bfe066e4589a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.558779 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.558843 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-image-registry/node-ca-vfj7m" podUID="acf8e218-4a2a-4d62-9aa8-7ecca1109d35" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.559193 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" podUID="7cae19a2-5349-4f13-94ab-bfe066e4589a" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.561516 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z255c" Mar 10 14:02:48 crc kubenswrapper[4911]: W0310 14:02:48.563867 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0fdaa42_a77f_4f62_b94f_6659225e12af.slice/crio-57c087d8b95706a9b2e0dac6f1f680b3250cfae897655488af8d99f19199a6d4 WatchSource:0}: Error finding container 57c087d8b95706a9b2e0dac6f1f680b3250cfae897655488af8d99f19199a6d4: Status 404 returned error can't find the container with id 57c087d8b95706a9b2e0dac6f1f680b3250cfae897655488af8d99f19199a6d4 Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.566423 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.569901 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nsxjn" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.570829 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 10 14:02:48 crc kubenswrapper[4911]: set -uo pipefail Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 10 14:02:48 crc kubenswrapper[4911]: HOSTS_FILE="/etc/hosts" Mar 10 14:02:48 crc kubenswrapper[4911]: TEMP_FILE="/etc/hosts.tmp" Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: # Make a temporary file with the old hosts file's attributes. Mar 10 14:02:48 crc kubenswrapper[4911]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 10 14:02:48 crc kubenswrapper[4911]: echo "Failed to preserve hosts file. Exiting." Mar 10 14:02:48 crc kubenswrapper[4911]: exit 1 Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: while true; do Mar 10 14:02:48 crc kubenswrapper[4911]: declare -A svc_ips Mar 10 14:02:48 crc kubenswrapper[4911]: for svc in "${services[@]}"; do Mar 10 14:02:48 crc kubenswrapper[4911]: # Fetch service IP from cluster dns if present. We make several tries Mar 10 14:02:48 crc kubenswrapper[4911]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 10 14:02:48 crc kubenswrapper[4911]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 10 14:02:48 crc kubenswrapper[4911]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 10 14:02:48 crc kubenswrapper[4911]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 14:02:48 crc kubenswrapper[4911]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 14:02:48 crc kubenswrapper[4911]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 14:02:48 crc kubenswrapper[4911]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 10 14:02:48 crc kubenswrapper[4911]: for i in ${!cmds[*]} Mar 10 14:02:48 crc kubenswrapper[4911]: do Mar 10 14:02:48 crc kubenswrapper[4911]: ips=($(eval "${cmds[i]}")) Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: svc_ips["${svc}"]="${ips[@]}" Mar 10 14:02:48 crc kubenswrapper[4911]: break Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: done Mar 10 14:02:48 crc kubenswrapper[4911]: done Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: # Update /etc/hosts only if we get valid service IPs Mar 10 14:02:48 crc kubenswrapper[4911]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 10 14:02:48 crc kubenswrapper[4911]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 10 14:02:48 crc kubenswrapper[4911]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 10 14:02:48 crc kubenswrapper[4911]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 10 14:02:48 crc kubenswrapper[4911]: sleep 60 & wait Mar 10 14:02:48 crc kubenswrapper[4911]: continue Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: # Append resolver entries for services Mar 10 14:02:48 crc kubenswrapper[4911]: rc=0 Mar 10 14:02:48 crc kubenswrapper[4911]: for svc in "${!svc_ips[@]}"; do Mar 10 14:02:48 crc kubenswrapper[4911]: for ip in ${svc_ips[${svc}]}; do Mar 10 14:02:48 crc kubenswrapper[4911]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 10 14:02:48 crc kubenswrapper[4911]: done Mar 10 14:02:48 crc kubenswrapper[4911]: done Mar 10 14:02:48 crc kubenswrapper[4911]: if [[ $rc -ne 0 ]]; then Mar 10 14:02:48 crc kubenswrapper[4911]: sleep 60 & wait Mar 10 14:02:48 crc kubenswrapper[4911]: continue Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: Mar 10 14:02:48 crc kubenswrapper[4911]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 10 14:02:48 crc kubenswrapper[4911]: # Replace /etc/hosts with our modified version if needed Mar 10 14:02:48 crc kubenswrapper[4911]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 10 14:02:48 crc kubenswrapper[4911]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 10 14:02:48 crc kubenswrapper[4911]: fi Mar 10 14:02:48 crc kubenswrapper[4911]: sleep 60 & wait Mar 10 14:02:48 crc kubenswrapper[4911]: unset svc_ips Mar 10 14:02:48 crc kubenswrapper[4911]: done Mar 10 14:02:48 crc kubenswrapper[4911]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-87sf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-9p6ll_openshift-dns(a0fdaa42-a77f-4f62-b94f-6659225e12af): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.571406 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat 
<< EOF > /etc/ovn/kubeconfig Mar 10 14:02:48 crc kubenswrapper[4911]: apiVersion: v1 Mar 10 14:02:48 crc kubenswrapper[4911]: clusters: Mar 10 14:02:48 crc kubenswrapper[4911]: - cluster: Mar 10 14:02:48 crc kubenswrapper[4911]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 14:02:48 crc kubenswrapper[4911]: server: https://api-int.crc.testing:6443 Mar 10 14:02:48 crc kubenswrapper[4911]: name: default-cluster Mar 10 14:02:48 crc kubenswrapper[4911]: contexts: Mar 10 14:02:48 crc kubenswrapper[4911]: - context: Mar 10 14:02:48 crc kubenswrapper[4911]: cluster: default-cluster Mar 10 14:02:48 crc kubenswrapper[4911]: namespace: default Mar 10 14:02:48 crc kubenswrapper[4911]: user: default-auth Mar 10 14:02:48 crc kubenswrapper[4911]: name: default-context Mar 10 14:02:48 crc kubenswrapper[4911]: current-context: default-context Mar 10 14:02:48 crc kubenswrapper[4911]: kind: Config Mar 10 14:02:48 crc kubenswrapper[4911]: preferences: {} Mar 10 14:02:48 crc kubenswrapper[4911]: users: Mar 10 14:02:48 crc kubenswrapper[4911]: - name: default-auth Mar 10 14:02:48 crc kubenswrapper[4911]: user: Mar 10 14:02:48 crc kubenswrapper[4911]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 14:02:48 crc kubenswrapper[4911]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 14:02:48 crc kubenswrapper[4911]: EOF Mar 10 14:02:48 crc kubenswrapper[4911]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rh7d8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.572761 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-9p6ll" podUID="a0fdaa42-a77f-4f62-b94f-6659225e12af" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.572830 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.578001 4911 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.590453 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjz78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-z255c_openshift-multus(349ee2ee-803a-404b-9aa2-2230eabdbb56): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.591046 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.591676 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-z255c" podUID="349ee2ee-803a-404b-9aa2-2230eabdbb56" Mar 10 14:02:48 crc kubenswrapper[4911]: W0310 14:02:48.592340 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc662696_d402_4969_bebd_00fa42e63075.slice/crio-59b810a4ec5f35d2f8f86f5040640486088df104605f36c538a3093540c1ed2d WatchSource:0}: Error finding container 59b810a4ec5f35d2f8f86f5040640486088df104605f36c538a3093540c1ed2d: Status 404 returned error can't find the container with id 59b810a4ec5f35d2f8f86f5040640486088df104605f36c538a3093540c1ed2d Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.595898 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:48 crc kubenswrapper[4911]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 10 14:02:48 crc kubenswrapper[4911]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon 
$MULTUS_DAEMON_OPT Mar 10 14:02:48 crc kubenswrapper[4911]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jxrrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-nsxjn_openshift-multus(fc662696-d402-4969-bebd-00fa42e63075): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:48 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.597991 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-nsxjn" podUID="fc662696-d402-4969-bebd-00fa42e63075" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.602628 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.617113 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.633244 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.643592 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.646629 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.646680 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.646692 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.646710 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.646737 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:48Z","lastTransitionTime":"2026-03-10T14:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.661341 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.678014 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.686089 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.703042 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.717624 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.728031 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.738082 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.747112 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.750889 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.750923 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.750934 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.750955 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.750968 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:48Z","lastTransitionTime":"2026-03-10T14:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.755900 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.755941 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.755970 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 
14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.755993 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756120 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756137 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756151 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756195 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:49.756178571 +0000 UTC m=+74.319698488 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756424 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756468 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:49.756456658 +0000 UTC m=+74.319976565 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756589 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756741 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:49.756693075 +0000 UTC m=+74.320213002 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756601 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756778 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756799 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.756851 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:49.756837628 +0000 UTC m=+74.320357555 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.758689 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.772159 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.786956 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.797754 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.808533 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.820399 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.831181 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.840999 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.853100 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.853521 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.853563 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.853577 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.853602 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.853618 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:48Z","lastTransitionTime":"2026-03-10T14:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.869694 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.889302 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.940972 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.956265 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.956333 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.956343 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.956362 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.956378 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:48Z","lastTransitionTime":"2026-03-10T14:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.957667 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:02:48 crc kubenswrapper[4911]: I0310 14:02:48.957878 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.957997 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:02:49.957952459 +0000 UTC m=+74.521472416 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.958071 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:02:48 crc kubenswrapper[4911]: E0310 14:02:48.958194 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs podName:d7a44efc-20ad-4c01-9606-e6fdb5e0c721 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:49.958158755 +0000 UTC m=+74.521678852 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs") pod "network-metrics-daemon-r28f8" (UID: "d7a44efc-20ad-4c01-9606-e6fdb5e0c721") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.059508 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.059584 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.059605 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.059631 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.059651 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:49Z","lastTransitionTime":"2026-03-10T14:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.162626 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.162703 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.162760 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.162828 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.162849 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:49Z","lastTransitionTime":"2026-03-10T14:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.192805 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.192978 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.265881 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.265931 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.265942 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.265961 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.265971 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:49Z","lastTransitionTime":"2026-03-10T14:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.368680 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.368817 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.368841 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.368875 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.368901 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:49Z","lastTransitionTime":"2026-03-10T14:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.472365 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.472408 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.472424 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.472448 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.472462 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:49Z","lastTransitionTime":"2026-03-10T14:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.549763 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"91bcd441701cce09d4b69201e187d85ba797da190cfb8f2d6e579f1921869e72"} Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.551828 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:49 crc kubenswrapper[4911]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 10 14:02:49 crc kubenswrapper[4911]: apiVersion: v1 Mar 10 14:02:49 crc kubenswrapper[4911]: clusters: Mar 10 14:02:49 crc kubenswrapper[4911]: - cluster: Mar 10 14:02:49 crc kubenswrapper[4911]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 14:02:49 crc kubenswrapper[4911]: server: https://api-int.crc.testing:6443 Mar 10 14:02:49 crc kubenswrapper[4911]: name: default-cluster Mar 10 14:02:49 crc kubenswrapper[4911]: contexts: Mar 10 14:02:49 crc kubenswrapper[4911]: - context: Mar 10 14:02:49 crc kubenswrapper[4911]: cluster: default-cluster Mar 10 14:02:49 crc kubenswrapper[4911]: namespace: default Mar 10 14:02:49 crc kubenswrapper[4911]: user: default-auth Mar 10 14:02:49 crc kubenswrapper[4911]: name: default-context Mar 10 14:02:49 crc kubenswrapper[4911]: current-context: default-context Mar 10 14:02:49 crc kubenswrapper[4911]: kind: Config Mar 10 14:02:49 crc kubenswrapper[4911]: preferences: {} Mar 10 14:02:49 crc kubenswrapper[4911]: users: Mar 10 14:02:49 crc kubenswrapper[4911]: - name: default-auth Mar 10 14:02:49 crc kubenswrapper[4911]: user: Mar 10 14:02:49 crc kubenswrapper[4911]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 14:02:49 crc 
kubenswrapper[4911]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 14:02:49 crc kubenswrapper[4911]: EOF Mar 10 14:02:49 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rh7d8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:49 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.552035 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9p6ll" event={"ID":"a0fdaa42-a77f-4f62-b94f-6659225e12af","Type":"ContainerStarted","Data":"57c087d8b95706a9b2e0dac6f1f680b3250cfae897655488af8d99f19199a6d4"} Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.552920 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" 
podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.553579 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:49 crc kubenswrapper[4911]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 10 14:02:49 crc kubenswrapper[4911]: set -uo pipefail Mar 10 14:02:49 crc kubenswrapper[4911]: Mar 10 14:02:49 crc kubenswrapper[4911]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 10 14:02:49 crc kubenswrapper[4911]: Mar 10 14:02:49 crc kubenswrapper[4911]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 10 14:02:49 crc kubenswrapper[4911]: HOSTS_FILE="/etc/hosts" Mar 10 14:02:49 crc kubenswrapper[4911]: TEMP_FILE="/etc/hosts.tmp" Mar 10 14:02:49 crc kubenswrapper[4911]: Mar 10 14:02:49 crc kubenswrapper[4911]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 10 14:02:49 crc kubenswrapper[4911]: Mar 10 14:02:49 crc kubenswrapper[4911]: # Make a temporary file with the old hosts file's attributes. Mar 10 14:02:49 crc kubenswrapper[4911]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 10 14:02:49 crc kubenswrapper[4911]: echo "Failed to preserve hosts file. Exiting." Mar 10 14:02:49 crc kubenswrapper[4911]: exit 1 Mar 10 14:02:49 crc kubenswrapper[4911]: fi Mar 10 14:02:49 crc kubenswrapper[4911]: Mar 10 14:02:49 crc kubenswrapper[4911]: while true; do Mar 10 14:02:49 crc kubenswrapper[4911]: declare -A svc_ips Mar 10 14:02:49 crc kubenswrapper[4911]: for svc in "${services[@]}"; do Mar 10 14:02:49 crc kubenswrapper[4911]: # Fetch service IP from cluster dns if present. We make several tries Mar 10 14:02:49 crc kubenswrapper[4911]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 10 14:02:49 crc kubenswrapper[4911]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 10 14:02:49 crc kubenswrapper[4911]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 10 14:02:49 crc kubenswrapper[4911]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 14:02:49 crc kubenswrapper[4911]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 14:02:49 crc kubenswrapper[4911]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 14:02:49 crc kubenswrapper[4911]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 10 14:02:49 crc kubenswrapper[4911]: for i in ${!cmds[*]} Mar 10 14:02:49 crc kubenswrapper[4911]: do Mar 10 14:02:49 crc kubenswrapper[4911]: ips=($(eval "${cmds[i]}")) Mar 10 14:02:49 crc kubenswrapper[4911]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 10 14:02:49 crc kubenswrapper[4911]: svc_ips["${svc}"]="${ips[@]}" Mar 10 14:02:49 crc kubenswrapper[4911]: break Mar 10 14:02:49 crc kubenswrapper[4911]: fi Mar 10 14:02:49 crc kubenswrapper[4911]: done Mar 10 14:02:49 crc kubenswrapper[4911]: done Mar 10 14:02:49 crc kubenswrapper[4911]: Mar 10 14:02:49 crc kubenswrapper[4911]: # Update /etc/hosts only if we get valid service IPs Mar 10 14:02:49 crc kubenswrapper[4911]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 10 14:02:49 crc kubenswrapper[4911]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 10 14:02:49 crc kubenswrapper[4911]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 10 14:02:49 crc kubenswrapper[4911]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 10 14:02:49 crc kubenswrapper[4911]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 10 14:02:49 crc kubenswrapper[4911]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 10 14:02:49 crc kubenswrapper[4911]: sleep 60 & wait Mar 10 14:02:49 crc kubenswrapper[4911]: continue Mar 10 14:02:49 crc kubenswrapper[4911]: fi Mar 10 14:02:49 crc kubenswrapper[4911]: Mar 10 14:02:49 crc kubenswrapper[4911]: # Append resolver entries for services Mar 10 14:02:49 crc kubenswrapper[4911]: rc=0 Mar 10 14:02:49 crc kubenswrapper[4911]: for svc in "${!svc_ips[@]}"; do Mar 10 14:02:49 crc kubenswrapper[4911]: for ip in ${svc_ips[${svc}]}; do Mar 10 14:02:49 crc kubenswrapper[4911]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 10 14:02:49 crc kubenswrapper[4911]: done Mar 10 14:02:49 crc kubenswrapper[4911]: done Mar 10 14:02:49 crc kubenswrapper[4911]: if [[ $rc -ne 0 ]]; then Mar 10 14:02:49 crc kubenswrapper[4911]: sleep 60 & wait Mar 10 14:02:49 crc kubenswrapper[4911]: continue Mar 10 14:02:49 crc kubenswrapper[4911]: fi Mar 10 14:02:49 crc kubenswrapper[4911]: Mar 10 14:02:49 crc kubenswrapper[4911]: Mar 10 14:02:49 crc kubenswrapper[4911]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 10 14:02:49 crc kubenswrapper[4911]: # Replace /etc/hosts with our modified version if needed Mar 10 14:02:49 crc kubenswrapper[4911]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 10 14:02:49 crc kubenswrapper[4911]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 10 14:02:49 crc kubenswrapper[4911]: fi Mar 10 14:02:49 crc kubenswrapper[4911]: sleep 60 & wait Mar 10 14:02:49 crc kubenswrapper[4911]: unset svc_ips Mar 10 14:02:49 crc kubenswrapper[4911]: done Mar 10 14:02:49 crc kubenswrapper[4911]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-87sf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-9p6ll_openshift-dns(a0fdaa42-a77f-4f62-b94f-6659225e12af): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:49 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.554381 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nsxjn" event={"ID":"fc662696-d402-4969-bebd-00fa42e63075","Type":"ContainerStarted","Data":"59b810a4ec5f35d2f8f86f5040640486088df104605f36c538a3093540c1ed2d"} Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 
14:02:49.554642 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-9p6ll" podUID="a0fdaa42-a77f-4f62-b94f-6659225e12af" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.555458 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" event={"ID":"349ee2ee-803a-404b-9aa2-2230eabdbb56","Type":"ContainerStarted","Data":"e9f816390b45fc4f4acb79c1a29d710d61c8753cd7e6e07fd350f9349e83bf20"} Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.556886 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:49 crc kubenswrapper[4911]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 10 14:02:49 crc kubenswrapper[4911]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 10 14:02:49 crc kubenswrapper[4911]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jxrrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-nsxjn_openshift-multus(fc662696-d402-4969-bebd-00fa42e63075): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:49 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.557064 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"abe2c76b8b957005db86ff1b700f841f0b0fbc54a7627f93991c098be87cbb22"} Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.557939 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},Vol
umeMount{Name:kube-api-access-gjz78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-z255c_openshift-multus(349ee2ee-803a-404b-9aa2-2230eabdbb56): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.558027 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-nsxjn" podUID="fc662696-d402-4969-bebd-00fa42e63075" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.558160 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vfj7m" event={"ID":"acf8e218-4a2a-4d62-9aa8-7ecca1109d35","Type":"ContainerStarted","Data":"c1716b3e14fca017184595af53a5327ec2a872604ff59b9d097a0fadf4c22237"} Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.558259 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgrqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.559044 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-z255c" podUID="349ee2ee-803a-404b-9aa2-2230eabdbb56" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.560321 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:02:49 crc kubenswrapper[4911]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 10 14:02:49 crc kubenswrapper[4911]: while [ true ]; Mar 10 14:02:49 crc kubenswrapper[4911]: do Mar 10 14:02:49 crc kubenswrapper[4911]: for f in $(ls /tmp/serviceca); do Mar 10 14:02:49 crc kubenswrapper[4911]: echo $f Mar 10 14:02:49 crc kubenswrapper[4911]: ca_file_path="/tmp/serviceca/${f}" Mar 10 14:02:49 crc kubenswrapper[4911]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 10 14:02:49 crc kubenswrapper[4911]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 10 14:02:49 crc kubenswrapper[4911]: if [ -e "${reg_dir_path}" ]; then Mar 10 14:02:49 crc kubenswrapper[4911]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 10 14:02:49 crc kubenswrapper[4911]: else Mar 10 14:02:49 crc kubenswrapper[4911]: mkdir $reg_dir_path Mar 10 14:02:49 crc kubenswrapper[4911]: cp $ca_file_path $reg_dir_path/ca.crt Mar 10 14:02:49 crc kubenswrapper[4911]: fi Mar 10 14:02:49 crc kubenswrapper[4911]: done Mar 10 14:02:49 crc kubenswrapper[4911]: for d in $(ls /etc/docker/certs.d); do Mar 10 14:02:49 crc kubenswrapper[4911]: echo $d Mar 10 14:02:49 crc kubenswrapper[4911]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 10 14:02:49 crc kubenswrapper[4911]: 
reg_conf_path="/tmp/serviceca/${dp}" Mar 10 14:02:49 crc kubenswrapper[4911]: if [ ! -e "${reg_conf_path}" ]; then Mar 10 14:02:49 crc kubenswrapper[4911]: rm -rf /etc/docker/certs.d/$d Mar 10 14:02:49 crc kubenswrapper[4911]: fi Mar 10 14:02:49 crc kubenswrapper[4911]: done Mar 10 14:02:49 crc kubenswrapper[4911]: sleep 60 & wait ${!} Mar 10 14:02:49 crc kubenswrapper[4911]: done Mar 10 14:02:49 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dplb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-vfj7m_openshift-image-registry(acf8e218-4a2a-4d62-9aa8-7ecca1109d35): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:02:49 crc kubenswrapper[4911]: > 
logger="UnhandledError" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.561032 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgrqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.561582 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-vfj7m" podUID="acf8e218-4a2a-4d62-9aa8-7ecca1109d35" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.562707 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" 
pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.564213 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.575611 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.575644 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.575653 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.575668 4911 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.575677 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:49Z","lastTransitionTime":"2026-03-10T14:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.577638 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.589065 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.604237 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.614866 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.627699 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.644991 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.656908 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.677379 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.678456 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.678507 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.678524 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.678545 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.678558 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:49Z","lastTransitionTime":"2026-03-10T14:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.691681 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.703262 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.724770 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.737149 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.751355 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.763035 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.767101 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.767178 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.767217 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767245 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767331 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:51.767311782 +0000 UTC m=+76.330831709 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.767248 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767370 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767460 4911 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767479 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767511 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767531 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:51.767519218 +0000 UTC m=+76.331039145 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767544 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767534 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767565 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767697 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:51.767661502 +0000 UTC m=+76.331181449 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.767788 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:51.767761354 +0000 UTC m=+76.331281491 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.773474 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.784326 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.784390 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.784420 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.784458 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.784489 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:49Z","lastTransitionTime":"2026-03-10T14:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.786284 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.798589 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.807582 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.820857 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.830108 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.840356 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.850467 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.888619 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.888667 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.888682 4911 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.888704 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.888716 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:49Z","lastTransitionTime":"2026-03-10T14:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.889693 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.936740 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.970074 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.970274 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.970410 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.970452 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs podName:d7a44efc-20ad-4c01-9606-e6fdb5e0c721 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:51.970439386 +0000 UTC m=+76.533959303 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs") pod "network-metrics-daemon-r28f8" (UID: "d7a44efc-20ad-4c01-9606-e6fdb5e0c721") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:02:49 crc kubenswrapper[4911]: E0310 14:02:49.970765 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:02:51.970756825 +0000 UTC m=+76.534276742 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.976667 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.992521 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.992599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.992620 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.992652 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:49 crc kubenswrapper[4911]: I0310 14:02:49.992676 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:49Z","lastTransitionTime":"2026-03-10T14:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.008415 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.071034 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.097265 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.097345 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.097364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.097391 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.097412 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:50Z","lastTransitionTime":"2026-03-10T14:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.192935 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.193015 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.192971 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:50 crc kubenswrapper[4911]: E0310 14:02:50.193141 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:02:50 crc kubenswrapper[4911]: E0310 14:02:50.193445 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:02:50 crc kubenswrapper[4911]: E0310 14:02:50.193603 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.200411 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.200461 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.200479 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.200503 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.200521 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:50Z","lastTransitionTime":"2026-03-10T14:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.201190 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.202356 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.204959 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.206610 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.208855 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.210253 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.211774 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.214025 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.215487 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.218084 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.219470 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.222666 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.223916 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.225237 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.227254 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.228448 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.230560 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.231466 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.232800 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.234910 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.236056 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.238222 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.239292 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.241896 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.242969 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.244379 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.246975 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.248124 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.250529 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.251521 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.253794 4911 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.254082 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.258216 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.260297 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.261892 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.265643 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.267566 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.269693 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.271095 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.273664 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.274877 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.277790 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.279189 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.281964 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.283133 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.284690 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.285546 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.287402 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.288141 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.289357 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.290041 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.291415 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.293119 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.293998 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.303890 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.303925 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:50 crc 
kubenswrapper[4911]: I0310 14:02:50.303937 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.303953 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.303966 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:50Z","lastTransitionTime":"2026-03-10T14:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.407324 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.407413 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.407438 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.407471 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.407495 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:50Z","lastTransitionTime":"2026-03-10T14:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.510085 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.510144 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.510162 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.510185 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.510201 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:50Z","lastTransitionTime":"2026-03-10T14:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.613101 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.613172 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.613190 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.613216 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.613232 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:50Z","lastTransitionTime":"2026-03-10T14:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.716375 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.716446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.716467 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.716493 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.716512 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:50Z","lastTransitionTime":"2026-03-10T14:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.821026 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.821130 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.821153 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.821189 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.821216 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:50Z","lastTransitionTime":"2026-03-10T14:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.924925 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.925013 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.925033 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.925061 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:50 crc kubenswrapper[4911]: I0310 14:02:50.925080 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:50Z","lastTransitionTime":"2026-03-10T14:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.027702 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.027761 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.027772 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.027789 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.027800 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:51Z","lastTransitionTime":"2026-03-10T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.130413 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.130467 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.130485 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.130510 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.130527 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:51Z","lastTransitionTime":"2026-03-10T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.193408 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.193608 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.232909 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.232978 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.232996 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.233023 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.233043 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:51Z","lastTransitionTime":"2026-03-10T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.336050 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.336087 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.336097 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.336111 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.336122 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:51Z","lastTransitionTime":"2026-03-10T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.438932 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.438976 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.438988 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.439007 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.439021 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:51Z","lastTransitionTime":"2026-03-10T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.541879 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.541919 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.541949 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.541968 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.541979 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:51Z","lastTransitionTime":"2026-03-10T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.644270 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.644368 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.644388 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.644440 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.644459 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:51Z","lastTransitionTime":"2026-03-10T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.747327 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.747398 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.747415 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.747445 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.747463 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:51Z","lastTransitionTime":"2026-03-10T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.794016 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.794080 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.794144 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.794190 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.794225 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.794265 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.794377 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.794397 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:55.794350451 +0000 UTC m=+80.357870398 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.794403 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.794433 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.794444 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:55.794422783 +0000 UTC m=+80.357943020 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.794497 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:55.794474314 +0000 UTC m=+80.357994271 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.795064 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.795111 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.795133 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.795201 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:55.795176103 +0000 UTC m=+80.358696220 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.851407 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.851474 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.851487 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.851505 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.851851 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:51Z","lastTransitionTime":"2026-03-10T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.956137 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.956210 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.956231 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.956261 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.956282 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:51Z","lastTransitionTime":"2026-03-10T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.996208 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:02:51 crc kubenswrapper[4911]: I0310 14:02:51.996356 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.996470 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:02:55.996432107 +0000 UTC m=+80.559952064 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.996554 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:02:51 crc kubenswrapper[4911]: E0310 14:02:51.996623 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs podName:d7a44efc-20ad-4c01-9606-e6fdb5e0c721 nodeName:}" failed. No retries permitted until 2026-03-10 14:02:55.996606032 +0000 UTC m=+80.560125989 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs") pod "network-metrics-daemon-r28f8" (UID: "d7a44efc-20ad-4c01-9606-e6fdb5e0c721") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.060569 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.060628 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.060640 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.060664 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.060678 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:52Z","lastTransitionTime":"2026-03-10T14:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.163966 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.164009 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.164020 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.164034 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.164045 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:52Z","lastTransitionTime":"2026-03-10T14:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.192954 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:02:52 crc kubenswrapper[4911]: E0310 14:02:52.193132 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.193430 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.193647 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:52 crc kubenswrapper[4911]: E0310 14:02:52.193830 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:02:52 crc kubenswrapper[4911]: E0310 14:02:52.193989 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.267862 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.267951 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.267976 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.268016 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.268046 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:52Z","lastTransitionTime":"2026-03-10T14:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.371824 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.371930 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.371950 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.371978 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.372000 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:52Z","lastTransitionTime":"2026-03-10T14:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.475694 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.476089 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.476214 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.476358 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.476494 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:52Z","lastTransitionTime":"2026-03-10T14:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.578928 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.578992 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.579010 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.579037 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.579058 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:52Z","lastTransitionTime":"2026-03-10T14:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.682464 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.682554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.682580 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.682607 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.682624 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:52Z","lastTransitionTime":"2026-03-10T14:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.785409 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.785487 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.785514 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.785550 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.785569 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:52Z","lastTransitionTime":"2026-03-10T14:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.888968 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.889053 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.889077 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.889110 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.889133 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:52Z","lastTransitionTime":"2026-03-10T14:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.994114 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.994214 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.994242 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.994280 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:52 crc kubenswrapper[4911]: I0310 14:02:52.994319 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:52Z","lastTransitionTime":"2026-03-10T14:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.098699 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.098817 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.098843 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.098876 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.098901 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:53Z","lastTransitionTime":"2026-03-10T14:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.193251 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:53 crc kubenswrapper[4911]: E0310 14:02:53.193512 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.202570 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.202676 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.202696 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.202748 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.202767 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:53Z","lastTransitionTime":"2026-03-10T14:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.305640 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.305713 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.305759 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.305787 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.305808 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:53Z","lastTransitionTime":"2026-03-10T14:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.409011 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.409132 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.409204 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.409244 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.409287 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:53Z","lastTransitionTime":"2026-03-10T14:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.512963 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.513020 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.513030 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.513047 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.513060 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:53Z","lastTransitionTime":"2026-03-10T14:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.616466 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.616536 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.616558 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.616590 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.616612 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:53Z","lastTransitionTime":"2026-03-10T14:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.719418 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.719469 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.719480 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.719497 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.719509 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:53Z","lastTransitionTime":"2026-03-10T14:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.821449 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.821879 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.821974 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.822059 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.822171 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:53Z","lastTransitionTime":"2026-03-10T14:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.925574 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.926198 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.926384 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.926562 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:53 crc kubenswrapper[4911]: I0310 14:02:53.926785 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:53Z","lastTransitionTime":"2026-03-10T14:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.030547 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.030981 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.031210 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.031369 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.031488 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:54Z","lastTransitionTime":"2026-03-10T14:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.135331 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.135719 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.135840 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.135923 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.135994 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:54Z","lastTransitionTime":"2026-03-10T14:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.192953 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:54 crc kubenswrapper[4911]: E0310 14:02:54.193212 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.193007 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:02:54 crc kubenswrapper[4911]: E0310 14:02:54.193462 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.193947 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:54 crc kubenswrapper[4911]: E0310 14:02:54.194127 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.238935 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.239014 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.239028 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.239058 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.239072 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:54Z","lastTransitionTime":"2026-03-10T14:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.343055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.343615 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.343940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.344187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.344335 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:54Z","lastTransitionTime":"2026-03-10T14:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.448238 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.448948 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.448996 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.449071 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.449097 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:54Z","lastTransitionTime":"2026-03-10T14:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.553887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.554566 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.554605 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.554643 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.554667 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:54Z","lastTransitionTime":"2026-03-10T14:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.657491 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.657597 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.657626 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.657674 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.657701 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:54Z","lastTransitionTime":"2026-03-10T14:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.761704 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.761831 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.761858 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.761894 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.761918 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:54Z","lastTransitionTime":"2026-03-10T14:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.865127 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.865184 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.865196 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.865218 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.865232 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:54Z","lastTransitionTime":"2026-03-10T14:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.968045 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.968123 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.968140 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.968166 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:54 crc kubenswrapper[4911]: I0310 14:02:54.968184 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:54Z","lastTransitionTime":"2026-03-10T14:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.071786 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.071871 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.071894 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.071921 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.071940 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:55Z","lastTransitionTime":"2026-03-10T14:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.174823 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.174910 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.174938 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.174984 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.175009 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:55Z","lastTransitionTime":"2026-03-10T14:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.193221 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.193412 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.277892 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.277947 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.277972 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.278000 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.278022 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:55Z","lastTransitionTime":"2026-03-10T14:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.381862 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.381938 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.381958 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.381989 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.382005 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:55Z","lastTransitionTime":"2026-03-10T14:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.485037 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.485103 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.485169 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.485203 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.485221 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:55Z","lastTransitionTime":"2026-03-10T14:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.587454 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.587550 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.587578 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.587637 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.587660 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:55Z","lastTransitionTime":"2026-03-10T14:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.691553 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.691633 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.691653 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.691680 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.691699 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:55Z","lastTransitionTime":"2026-03-10T14:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.795506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.795585 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.795604 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.795632 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.795652 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:55Z","lastTransitionTime":"2026-03-10T14:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.843813 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.843875 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.843929 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.843954 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844084 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844089 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844129 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844145 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844156 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:03.844133211 +0000 UTC m=+88.407653138 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844215 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-10 14:03:03.844194592 +0000 UTC m=+88.407714519 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844227 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844254 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844312 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844335 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844377 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:03.844337576 +0000 UTC m=+88.407857533 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:02:55 crc kubenswrapper[4911]: E0310 14:02:55.844450 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:03.844409578 +0000 UTC m=+88.407929675 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.899762 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.899857 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.899882 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.899905 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:55 crc kubenswrapper[4911]: I0310 14:02:55.899948 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:55Z","lastTransitionTime":"2026-03-10T14:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.003479 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.003583 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.003601 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.003621 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.003632 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.046533 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.046807 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:03:04.046782532 +0000 UTC m=+88.610302459 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.046921 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.047046 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.047105 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs podName:d7a44efc-20ad-4c01-9606-e6fdb5e0c721 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:04.04709236 +0000 UTC m=+88.610612287 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs") pod "network-metrics-daemon-r28f8" (UID: "d7a44efc-20ad-4c01-9606-e6fdb5e0c721") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.084882 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.084950 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.084972 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.085003 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.085022 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.109605 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.116136 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.116192 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.116210 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.116235 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.116254 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.129712 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.135524 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.135587 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.135607 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.135633 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.135652 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.152168 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.157319 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.157383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.157402 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.157427 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.157446 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.173942 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.179123 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.179177 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.179196 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.179221 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.179240 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.192517 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.192562 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.192604 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.192694 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.192871 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.192994 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.195770 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: E0310 14:02:56.195913 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.197343 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.197365 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.197378 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.197393 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.197406 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.208972 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.224999 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.241372 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.253045 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.265200 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.283702 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.298506 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.299747 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.299774 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.299785 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.299800 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.299811 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.308760 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.322187 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.331019 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.350352 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.360587 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.369196 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.376889 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.401527 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc 
kubenswrapper[4911]: I0310 14:02:56.401579 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.401597 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.401619 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.401638 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.504097 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.504161 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.504181 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.504203 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.504223 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.606929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.606996 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.607016 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.607041 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.607062 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.711351 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.711451 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.711472 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.711506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.711574 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.814326 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.814390 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.814408 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.814430 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.814449 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.917949 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.918060 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.918082 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.918141 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:56 crc kubenswrapper[4911]: I0310 14:02:56.918159 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:56Z","lastTransitionTime":"2026-03-10T14:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.021435 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.021480 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.021504 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.021533 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.021545 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:57Z","lastTransitionTime":"2026-03-10T14:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.125194 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.125252 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.125269 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.125292 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.125310 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:57Z","lastTransitionTime":"2026-03-10T14:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.192536 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:57 crc kubenswrapper[4911]: E0310 14:02:57.192763 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.208118 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.208253 4911 scope.go:117] "RemoveContainer" containerID="75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666" Mar 10 14:02:57 crc kubenswrapper[4911]: E0310 14:02:57.208515 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.229368 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.229454 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.229475 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.229498 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.229553 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:57Z","lastTransitionTime":"2026-03-10T14:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.332471 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.332522 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.332549 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.332576 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.332598 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:57Z","lastTransitionTime":"2026-03-10T14:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.435432 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.435522 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.435547 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.435579 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.435606 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:57Z","lastTransitionTime":"2026-03-10T14:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.539014 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.539096 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.539120 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.539152 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.539174 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:57Z","lastTransitionTime":"2026-03-10T14:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.584632 4911 scope.go:117] "RemoveContainer" containerID="75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666" Mar 10 14:02:57 crc kubenswrapper[4911]: E0310 14:02:57.584938 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.642477 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.642590 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.642619 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.642653 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.642679 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:57Z","lastTransitionTime":"2026-03-10T14:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.745989 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.746088 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.746109 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.746140 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.746169 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:57Z","lastTransitionTime":"2026-03-10T14:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.848989 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.849087 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.849107 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.849136 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.849157 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:57Z","lastTransitionTime":"2026-03-10T14:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.952959 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.953033 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.953052 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.953073 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:57 crc kubenswrapper[4911]: I0310 14:02:57.953091 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:57Z","lastTransitionTime":"2026-03-10T14:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.056213 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.056287 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.056316 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.056351 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.056374 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:58Z","lastTransitionTime":"2026-03-10T14:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.159354 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.159816 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.159991 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.160125 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.160302 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:58Z","lastTransitionTime":"2026-03-10T14:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.192414 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.192414 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.192585 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:02:58 crc kubenswrapper[4911]: E0310 14:02:58.192808 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:02:58 crc kubenswrapper[4911]: E0310 14:02:58.192965 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:02:58 crc kubenswrapper[4911]: E0310 14:02:58.193119 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.263688 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.263784 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.263805 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.263835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.263854 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:58Z","lastTransitionTime":"2026-03-10T14:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.367712 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.368448 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.368496 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.368523 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.368538 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:58Z","lastTransitionTime":"2026-03-10T14:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.445636 4911 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.472654 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.472700 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.472712 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.472767 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.472783 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:58Z","lastTransitionTime":"2026-03-10T14:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.575911 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.575973 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.575987 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.576010 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.576026 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:58Z","lastTransitionTime":"2026-03-10T14:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.678787 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.678842 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.678852 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.678868 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.678879 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:58Z","lastTransitionTime":"2026-03-10T14:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.781630 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.781716 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.781895 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.781924 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.781942 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:58Z","lastTransitionTime":"2026-03-10T14:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.885081 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.885147 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.885160 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.885180 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.885193 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:58Z","lastTransitionTime":"2026-03-10T14:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.988881 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.988972 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.989008 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.989042 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:58 crc kubenswrapper[4911]: I0310 14:02:58.989066 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:58Z","lastTransitionTime":"2026-03-10T14:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.092107 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.092151 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.092169 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.092193 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.092210 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:59Z","lastTransitionTime":"2026-03-10T14:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.192874 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:02:59 crc kubenswrapper[4911]: E0310 14:02:59.193134 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.197329 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.197418 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.197438 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.197468 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.197486 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:59Z","lastTransitionTime":"2026-03-10T14:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.301821 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.301891 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.301912 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.301938 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.301956 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:59Z","lastTransitionTime":"2026-03-10T14:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.405839 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.405931 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.405965 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.406001 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.406023 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:59Z","lastTransitionTime":"2026-03-10T14:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.509914 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.510065 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.510095 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.510123 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.510141 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:59Z","lastTransitionTime":"2026-03-10T14:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.614467 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.614521 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.614550 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.614582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.614601 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:59Z","lastTransitionTime":"2026-03-10T14:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.717692 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.717792 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.717810 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.717836 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.717857 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:59Z","lastTransitionTime":"2026-03-10T14:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.822211 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.822267 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.822285 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.822316 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.822336 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:59Z","lastTransitionTime":"2026-03-10T14:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.926846 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.926918 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.927123 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.927156 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.927177 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:02:59Z","lastTransitionTime":"2026-03-10T14:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:02:59 crc kubenswrapper[4911]: I0310 14:02:59.953044 4911 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.030506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.030575 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.030592 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.030617 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.030638 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:00Z","lastTransitionTime":"2026-03-10T14:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.133773 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.133805 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.133814 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.133827 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.133836 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:00Z","lastTransitionTime":"2026-03-10T14:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.193524 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.193657 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.193937 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:00 crc kubenswrapper[4911]: E0310 14:03:00.194114 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:00 crc kubenswrapper[4911]: E0310 14:03:00.194355 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:00 crc kubenswrapper[4911]: E0310 14:03:00.194659 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:00 crc kubenswrapper[4911]: E0310 14:03:00.197113 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:03:00 crc kubenswrapper[4911]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ -f "/env/_master" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: set -o allexport Mar 10 14:03:00 crc kubenswrapper[4911]: source "/env/_master" Mar 10 14:03:00 crc kubenswrapper[4911]: set +o allexport Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 14:03:00 crc kubenswrapper[4911]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 14:03:00 crc kubenswrapper[4911]: ho_enable="--enable-hybrid-overlay" Mar 10 14:03:00 crc kubenswrapper[4911]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 14:03:00 crc kubenswrapper[4911]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 14:03:00 crc kubenswrapper[4911]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 14:03:00 crc kubenswrapper[4911]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 14:03:00 crc kubenswrapper[4911]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 14:03:00 crc kubenswrapper[4911]: --webhook-host=127.0.0.1 \ Mar 10 14:03:00 crc kubenswrapper[4911]: --webhook-port=9743 \ Mar 10 14:03:00 crc kubenswrapper[4911]: ${ho_enable} \ Mar 10 14:03:00 crc kubenswrapper[4911]: --enable-interconnect \ Mar 10 14:03:00 crc 
kubenswrapper[4911]: --disable-approver \ Mar 10 14:03:00 crc kubenswrapper[4911]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 14:03:00 crc kubenswrapper[4911]: --wait-for-kubernetes-api=200s \ Mar 10 14:03:00 crc kubenswrapper[4911]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 14:03:00 crc kubenswrapper[4911]: --loglevel="${LOGLEVEL}" Mar 10 14:03:00 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions
:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:03:00 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:03:00 crc kubenswrapper[4911]: E0310 14:03:00.197223 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:03:00 crc kubenswrapper[4911]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 10 14:03:00 crc kubenswrapper[4911]: set -euo pipefail Mar 10 14:03:00 crc kubenswrapper[4911]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 10 14:03:00 crc kubenswrapper[4911]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 10 14:03:00 crc kubenswrapper[4911]: # As the secret mount is optional we must wait for the files to be present. Mar 10 14:03:00 crc kubenswrapper[4911]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 10 14:03:00 crc kubenswrapper[4911]: TS=$(date +%s) Mar 10 14:03:00 crc kubenswrapper[4911]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 10 14:03:00 crc kubenswrapper[4911]: HAS_LOGGED_INFO=0 Mar 10 14:03:00 crc kubenswrapper[4911]: Mar 10 14:03:00 crc kubenswrapper[4911]: log_missing_certs(){ Mar 10 14:03:00 crc kubenswrapper[4911]: CUR_TS=$(date +%s) Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. 
Mar 10 14:03:00 crc kubenswrapper[4911]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 10 14:03:00 crc kubenswrapper[4911]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 10 14:03:00 crc kubenswrapper[4911]: HAS_LOGGED_INFO=1 Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: } Mar 10 14:03:00 crc kubenswrapper[4911]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do Mar 10 14:03:00 crc kubenswrapper[4911]: log_missing_certs Mar 10 14:03:00 crc kubenswrapper[4911]: sleep 5 Mar 10 14:03:00 crc kubenswrapper[4911]: done Mar 10 14:03:00 crc kubenswrapper[4911]: Mar 10 14:03:00 crc kubenswrapper[4911]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 10 14:03:00 crc kubenswrapper[4911]: exec /usr/bin/kube-rbac-proxy \ Mar 10 14:03:00 crc kubenswrapper[4911]: --logtostderr \ Mar 10 14:03:00 crc kubenswrapper[4911]: --secure-listen-address=:9108 \ Mar 10 14:03:00 crc kubenswrapper[4911]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 10 14:03:00 crc kubenswrapper[4911]: --upstream=http://127.0.0.1:29108/ \ Mar 10 14:03:00 crc kubenswrapper[4911]: --tls-private-key-file=${TLS_PK} \ Mar 10 14:03:00 crc kubenswrapper[4911]: --tls-cert-file=${TLS_CERT} Mar 10 14:03:00 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9n2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-xxbj8_openshift-ovn-kubernetes(7cae19a2-5349-4f13-94ab-bfe066e4589a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:03:00 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:03:00 crc kubenswrapper[4911]: E0310 14:03:00.200609 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:03:00 crc kubenswrapper[4911]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ -f "/env/_master" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: set -o allexport Mar 10 14:03:00 crc kubenswrapper[4911]: source "/env/_master" Mar 10 14:03:00 crc kubenswrapper[4911]: set +o allexport Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: Mar 10 14:03:00 crc kubenswrapper[4911]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 14:03:00 crc kubenswrapper[4911]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 
14:03:00 crc kubenswrapper[4911]: --disable-webhook \ Mar 10 14:03:00 crc kubenswrapper[4911]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 14:03:00 crc kubenswrapper[4911]: --loglevel="${LOGLEVEL}" Mar 10 14:03:00 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:03:00 crc kubenswrapper[4911]: > 
logger="UnhandledError" Mar 10 14:03:00 crc kubenswrapper[4911]: E0310 14:03:00.201323 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:03:00 crc kubenswrapper[4911]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ -f "/env/_master" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: set -o allexport Mar 10 14:03:00 crc kubenswrapper[4911]: source "/env/_master" Mar 10 14:03:00 crc kubenswrapper[4911]: set +o allexport Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: Mar 10 14:03:00 crc kubenswrapper[4911]: ovn_v4_join_subnet_opt= Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: ovn_v6_join_subnet_opt= Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: Mar 10 14:03:00 crc kubenswrapper[4911]: ovn_v4_transit_switch_subnet_opt= Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: ovn_v6_transit_switch_subnet_opt= Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ "" != "" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: Mar 10 14:03:00 crc kubenswrapper[4911]: 
dns_name_resolver_enabled_flag= Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ "false" == "true" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: Mar 10 14:03:00 crc kubenswrapper[4911]: persistent_ips_enabled_flag= Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ "true" == "true" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: Mar 10 14:03:00 crc kubenswrapper[4911]: # This is needed so that converting clusters from GA to TP Mar 10 14:03:00 crc kubenswrapper[4911]: # will rollout control plane pods as well Mar 10 14:03:00 crc kubenswrapper[4911]: network_segmentation_enabled_flag= Mar 10 14:03:00 crc kubenswrapper[4911]: multi_network_enabled_flag= Mar 10 14:03:00 crc kubenswrapper[4911]: if [[ "true" == "true" ]]; then Mar 10 14:03:00 crc kubenswrapper[4911]: multi_network_enabled_flag="--enable-multi-network" Mar 10 14:03:00 crc kubenswrapper[4911]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 10 14:03:00 crc kubenswrapper[4911]: fi Mar 10 14:03:00 crc kubenswrapper[4911]: Mar 10 14:03:00 crc kubenswrapper[4911]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 10 14:03:00 crc kubenswrapper[4911]: exec /usr/bin/ovnkube \ Mar 10 14:03:00 crc kubenswrapper[4911]: --enable-interconnect \ Mar 10 14:03:00 crc kubenswrapper[4911]: --init-cluster-manager "${K8S_NODE}" \ Mar 10 14:03:00 crc kubenswrapper[4911]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 10 14:03:00 crc kubenswrapper[4911]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 10 14:03:00 crc kubenswrapper[4911]: --metrics-bind-address "127.0.0.1:29108" \ Mar 10 14:03:00 crc kubenswrapper[4911]: --metrics-enable-pprof \ Mar 10 14:03:00 
crc kubenswrapper[4911]: --metrics-enable-config-duration \ Mar 10 14:03:00 crc kubenswrapper[4911]: ${ovn_v4_join_subnet_opt} \ Mar 10 14:03:00 crc kubenswrapper[4911]: ${ovn_v6_join_subnet_opt} \ Mar 10 14:03:00 crc kubenswrapper[4911]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 10 14:03:00 crc kubenswrapper[4911]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 10 14:03:00 crc kubenswrapper[4911]: ${dns_name_resolver_enabled_flag} \ Mar 10 14:03:00 crc kubenswrapper[4911]: ${persistent_ips_enabled_flag} \ Mar 10 14:03:00 crc kubenswrapper[4911]: ${multi_network_enabled_flag} \ Mar 10 14:03:00 crc kubenswrapper[4911]: ${network_segmentation_enabled_flag} Mar 10 14:03:00 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9n2ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-xxbj8_openshift-ovn-kubernetes(7cae19a2-5349-4f13-94ab-bfe066e4589a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:03:00 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:03:00 crc kubenswrapper[4911]: E0310 14:03:00.201776 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 14:03:00 crc kubenswrapper[4911]: E0310 14:03:00.202633 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct 
envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" podUID="7cae19a2-5349-4f13-94ab-bfe066e4589a" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.238248 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.238340 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.238359 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.238384 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.238426 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:00Z","lastTransitionTime":"2026-03-10T14:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.342367 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.342442 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.342460 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.342489 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.342509 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:00Z","lastTransitionTime":"2026-03-10T14:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.446390 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.446497 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.446523 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.446558 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.446585 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:00Z","lastTransitionTime":"2026-03-10T14:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.550717 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.550816 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.550833 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.550863 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.550883 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:00Z","lastTransitionTime":"2026-03-10T14:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.655293 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.655405 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.655426 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.655460 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.655479 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:00Z","lastTransitionTime":"2026-03-10T14:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.758608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.758679 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.758696 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.758739 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.758760 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:00Z","lastTransitionTime":"2026-03-10T14:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.862793 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.862872 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.862892 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.862923 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.862945 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:00Z","lastTransitionTime":"2026-03-10T14:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.966574 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.966694 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.966758 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.966796 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:00 crc kubenswrapper[4911]: I0310 14:03:00.966820 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:00Z","lastTransitionTime":"2026-03-10T14:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.070095 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.070194 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.070221 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.070255 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.070286 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:01Z","lastTransitionTime":"2026-03-10T14:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.173304 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.173357 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.173371 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.173392 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.173409 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:01Z","lastTransitionTime":"2026-03-10T14:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.193391 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:01 crc kubenswrapper[4911]: E0310 14:03:01.193893 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:01 crc kubenswrapper[4911]: E0310 14:03:01.195501 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice
{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 14:03:01 crc kubenswrapper[4911]: E0310 14:03:01.195900 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:03:01 crc kubenswrapper[4911]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 10 14:03:01 crc kubenswrapper[4911]: set -uo pipefail Mar 10 14:03:01 crc kubenswrapper[4911]: Mar 10 14:03:01 crc kubenswrapper[4911]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 10 14:03:01 crc kubenswrapper[4911]: Mar 10 14:03:01 crc kubenswrapper[4911]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 10 14:03:01 crc kubenswrapper[4911]: HOSTS_FILE="/etc/hosts" Mar 10 14:03:01 crc kubenswrapper[4911]: TEMP_FILE="/etc/hosts.tmp" Mar 10 14:03:01 crc kubenswrapper[4911]: Mar 10 14:03:01 crc kubenswrapper[4911]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 10 14:03:01 crc kubenswrapper[4911]: Mar 10 14:03:01 crc kubenswrapper[4911]: # Make a temporary file with the old hosts file's attributes. Mar 10 14:03:01 crc kubenswrapper[4911]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 10 14:03:01 crc kubenswrapper[4911]: echo "Failed to preserve hosts file. Exiting." 
Mar 10 14:03:01 crc kubenswrapper[4911]: exit 1 Mar 10 14:03:01 crc kubenswrapper[4911]: fi Mar 10 14:03:01 crc kubenswrapper[4911]: Mar 10 14:03:01 crc kubenswrapper[4911]: while true; do Mar 10 14:03:01 crc kubenswrapper[4911]: declare -A svc_ips Mar 10 14:03:01 crc kubenswrapper[4911]: for svc in "${services[@]}"; do Mar 10 14:03:01 crc kubenswrapper[4911]: # Fetch service IP from cluster dns if present. We make several tries Mar 10 14:03:01 crc kubenswrapper[4911]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 10 14:03:01 crc kubenswrapper[4911]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 10 14:03:01 crc kubenswrapper[4911]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 10 14:03:01 crc kubenswrapper[4911]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 14:03:01 crc kubenswrapper[4911]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 14:03:01 crc kubenswrapper[4911]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 14:03:01 crc kubenswrapper[4911]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 10 14:03:01 crc kubenswrapper[4911]: for i in ${!cmds[*]} Mar 10 14:03:01 crc kubenswrapper[4911]: do Mar 10 14:03:01 crc kubenswrapper[4911]: ips=($(eval "${cmds[i]}")) Mar 10 14:03:01 crc kubenswrapper[4911]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 10 14:03:01 crc kubenswrapper[4911]: svc_ips["${svc}"]="${ips[@]}" Mar 10 14:03:01 crc kubenswrapper[4911]: break Mar 10 14:03:01 crc kubenswrapper[4911]: fi Mar 10 14:03:01 crc kubenswrapper[4911]: done Mar 10 14:03:01 crc kubenswrapper[4911]: done Mar 10 14:03:01 crc kubenswrapper[4911]: Mar 10 14:03:01 crc kubenswrapper[4911]: # Update /etc/hosts only if we get valid service IPs Mar 10 14:03:01 crc kubenswrapper[4911]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 10 14:03:01 crc kubenswrapper[4911]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 10 14:03:01 crc kubenswrapper[4911]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 10 14:03:01 crc kubenswrapper[4911]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 10 14:03:01 crc kubenswrapper[4911]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 10 14:03:01 crc kubenswrapper[4911]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 10 14:03:01 crc kubenswrapper[4911]: sleep 60 & wait Mar 10 14:03:01 crc kubenswrapper[4911]: continue Mar 10 14:03:01 crc kubenswrapper[4911]: fi Mar 10 14:03:01 crc kubenswrapper[4911]: Mar 10 14:03:01 crc kubenswrapper[4911]: # Append resolver entries for services Mar 10 14:03:01 crc kubenswrapper[4911]: rc=0 Mar 10 14:03:01 crc kubenswrapper[4911]: for svc in "${!svc_ips[@]}"; do Mar 10 14:03:01 crc kubenswrapper[4911]: for ip in ${svc_ips[${svc}]}; do Mar 10 14:03:01 crc kubenswrapper[4911]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 10 14:03:01 crc kubenswrapper[4911]: done Mar 10 14:03:01 crc kubenswrapper[4911]: done Mar 10 14:03:01 crc kubenswrapper[4911]: if [[ $rc -ne 0 ]]; then Mar 10 14:03:01 crc kubenswrapper[4911]: sleep 60 & wait Mar 10 14:03:01 crc kubenswrapper[4911]: continue Mar 10 14:03:01 crc kubenswrapper[4911]: fi Mar 10 14:03:01 crc kubenswrapper[4911]: Mar 10 14:03:01 crc kubenswrapper[4911]: Mar 10 14:03:01 crc kubenswrapper[4911]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 10 14:03:01 crc kubenswrapper[4911]: # Replace /etc/hosts with our modified version if needed Mar 10 14:03:01 crc kubenswrapper[4911]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 10 14:03:01 crc kubenswrapper[4911]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 10 14:03:01 crc kubenswrapper[4911]: fi Mar 10 14:03:01 crc kubenswrapper[4911]: sleep 60 & wait Mar 10 14:03:01 crc kubenswrapper[4911]: unset svc_ips Mar 10 14:03:01 crc kubenswrapper[4911]: done Mar 10 14:03:01 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-87sf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-9p6ll_openshift-dns(a0fdaa42-a77f-4f62-b94f-6659225e12af): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:03:01 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:03:01 crc kubenswrapper[4911]: E0310 14:03:01.197101 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-9p6ll" podUID="a0fdaa42-a77f-4f62-b94f-6659225e12af" Mar 10 14:03:01 crc kubenswrapper[4911]: E0310 14:03:01.197146 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" 
podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.213179 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.276200 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.276247 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.276260 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.276283 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.276297 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:01Z","lastTransitionTime":"2026-03-10T14:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.379423 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.379477 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.379488 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.379509 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.379523 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:01Z","lastTransitionTime":"2026-03-10T14:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.482426 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.482471 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.482481 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.482499 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.482511 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:01Z","lastTransitionTime":"2026-03-10T14:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.585449 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.585484 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.585495 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.585510 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.585521 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:01Z","lastTransitionTime":"2026-03-10T14:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.688167 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.688214 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.688225 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.688241 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.688252 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:01Z","lastTransitionTime":"2026-03-10T14:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.790511 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.790577 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.790598 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.790625 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.790645 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:01Z","lastTransitionTime":"2026-03-10T14:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.893257 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.893313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.893326 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.893345 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.893359 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:01Z","lastTransitionTime":"2026-03-10T14:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.996044 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.996089 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.996099 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.996114 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:01 crc kubenswrapper[4911]: I0310 14:03:01.996123 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:01Z","lastTransitionTime":"2026-03-10T14:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.098649 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.098694 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.098706 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.098751 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.098763 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:02Z","lastTransitionTime":"2026-03-10T14:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.192956 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.193010 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.193010 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:02 crc kubenswrapper[4911]: E0310 14:03:02.193212 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:02 crc kubenswrapper[4911]: E0310 14:03:02.193283 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:02 crc kubenswrapper[4911]: E0310 14:03:02.193497 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:02 crc kubenswrapper[4911]: E0310 14:03:02.195546 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:03:02 crc kubenswrapper[4911]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 10 14:03:02 crc kubenswrapper[4911]: while [ true ]; Mar 10 14:03:02 crc kubenswrapper[4911]: do Mar 10 14:03:02 crc kubenswrapper[4911]: for f in $(ls /tmp/serviceca); do Mar 10 14:03:02 crc kubenswrapper[4911]: echo $f Mar 10 14:03:02 crc kubenswrapper[4911]: ca_file_path="/tmp/serviceca/${f}" Mar 10 14:03:02 crc kubenswrapper[4911]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 10 14:03:02 crc kubenswrapper[4911]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 10 14:03:02 crc kubenswrapper[4911]: if [ -e "${reg_dir_path}" ]; then Mar 10 14:03:02 crc kubenswrapper[4911]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 10 14:03:02 crc kubenswrapper[4911]: else Mar 10 14:03:02 crc kubenswrapper[4911]: mkdir $reg_dir_path Mar 10 14:03:02 crc kubenswrapper[4911]: cp $ca_file_path $reg_dir_path/ca.crt Mar 10 14:03:02 crc kubenswrapper[4911]: fi Mar 10 14:03:02 crc kubenswrapper[4911]: done Mar 10 14:03:02 crc kubenswrapper[4911]: for d in $(ls /etc/docker/certs.d); do Mar 10 14:03:02 crc kubenswrapper[4911]: echo $d Mar 10 14:03:02 crc kubenswrapper[4911]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 10 14:03:02 crc kubenswrapper[4911]: reg_conf_path="/tmp/serviceca/${dp}" Mar 10 14:03:02 crc kubenswrapper[4911]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 10 14:03:02 crc kubenswrapper[4911]: rm -rf /etc/docker/certs.d/$d Mar 10 14:03:02 crc kubenswrapper[4911]: fi Mar 10 14:03:02 crc kubenswrapper[4911]: done Mar 10 14:03:02 crc kubenswrapper[4911]: sleep 60 & wait ${!} Mar 10 14:03:02 crc kubenswrapper[4911]: done Mar 10 14:03:02 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dplb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-vfj7m_openshift-image-registry(acf8e218-4a2a-4d62-9aa8-7ecca1109d35): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:03:02 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:03:02 crc kubenswrapper[4911]: E0310 14:03:02.197004 4911 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-vfj7m" podUID="acf8e218-4a2a-4d62-9aa8-7ecca1109d35" Mar 10 14:03:02 crc kubenswrapper[4911]: E0310 14:03:02.197426 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:03:02 crc kubenswrapper[4911]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 10 14:03:02 crc kubenswrapper[4911]: apiVersion: v1 Mar 10 14:03:02 crc kubenswrapper[4911]: clusters: Mar 10 14:03:02 crc kubenswrapper[4911]: - cluster: Mar 10 14:03:02 crc kubenswrapper[4911]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 14:03:02 crc kubenswrapper[4911]: server: https://api-int.crc.testing:6443 Mar 10 14:03:02 crc kubenswrapper[4911]: name: default-cluster Mar 10 14:03:02 crc kubenswrapper[4911]: contexts: Mar 10 14:03:02 crc kubenswrapper[4911]: - context: Mar 10 14:03:02 crc kubenswrapper[4911]: cluster: default-cluster Mar 10 14:03:02 crc kubenswrapper[4911]: namespace: default Mar 10 14:03:02 crc kubenswrapper[4911]: user: default-auth Mar 10 14:03:02 crc kubenswrapper[4911]: name: default-context Mar 10 14:03:02 crc kubenswrapper[4911]: current-context: default-context Mar 10 14:03:02 crc kubenswrapper[4911]: kind: Config Mar 10 14:03:02 crc kubenswrapper[4911]: preferences: {} Mar 10 14:03:02 crc kubenswrapper[4911]: users: Mar 10 14:03:02 crc kubenswrapper[4911]: - name: default-auth Mar 10 14:03:02 crc kubenswrapper[4911]: user: Mar 10 14:03:02 crc kubenswrapper[4911]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 14:03:02 crc kubenswrapper[4911]: client-key: 
/etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 14:03:02 crc kubenswrapper[4911]: EOF Mar 10 14:03:02 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rh7d8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:03:02 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:03:02 crc kubenswrapper[4911]: E0310 14:03:02.199027 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.202117 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.202179 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.202199 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.202220 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.202242 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:02Z","lastTransitionTime":"2026-03-10T14:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.304915 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.304980 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.304999 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.305025 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.305042 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:02Z","lastTransitionTime":"2026-03-10T14:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.408070 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.408131 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.408147 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.408173 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.408190 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:02Z","lastTransitionTime":"2026-03-10T14:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.511356 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.511415 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.511430 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.511452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.511466 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:02Z","lastTransitionTime":"2026-03-10T14:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.613555 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.613639 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.613663 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.613690 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.613708 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:02Z","lastTransitionTime":"2026-03-10T14:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.717547 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.717624 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.717645 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.717672 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.717692 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:02Z","lastTransitionTime":"2026-03-10T14:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.821200 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.821279 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.821291 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.821315 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.821329 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:02Z","lastTransitionTime":"2026-03-10T14:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.924409 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.924473 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.924492 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.924515 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:02 crc kubenswrapper[4911]: I0310 14:03:02.924537 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:02Z","lastTransitionTime":"2026-03-10T14:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.027898 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.028299 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.028458 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.028611 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.028784 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:03Z","lastTransitionTime":"2026-03-10T14:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.132209 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.132266 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.132285 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.132309 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.132327 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:03Z","lastTransitionTime":"2026-03-10T14:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.193138 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.193462 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.195459 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgrqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.195691 4911 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 14:03:03 crc kubenswrapper[4911]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 14:03:03 crc kubenswrapper[4911]: set -o allexport Mar 10 14:03:03 crc kubenswrapper[4911]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 14:03:03 crc kubenswrapper[4911]: source /etc/kubernetes/apiserver-url.env Mar 10 14:03:03 crc kubenswrapper[4911]: else Mar 10 14:03:03 crc kubenswrapper[4911]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 14:03:03 crc kubenswrapper[4911]: exit 1 Mar 10 14:03:03 crc kubenswrapper[4911]: fi Mar 10 14:03:03 crc kubenswrapper[4911]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 
14:03:03 crc kubenswrapper[4911]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTI
ON_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,Fi
eldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 14:03:03 crc kubenswrapper[4911]: > logger="UnhandledError" Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.196839 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.198553 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgrqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.199872 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.235157 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.235236 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.235262 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.235298 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.235326 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:03Z","lastTransitionTime":"2026-03-10T14:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.338287 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.338347 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.338363 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.338387 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.338402 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:03Z","lastTransitionTime":"2026-03-10T14:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.441993 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.442059 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.442078 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.442141 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.442166 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:03Z","lastTransitionTime":"2026-03-10T14:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.546045 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.546120 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.546148 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.546181 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.546205 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:03Z","lastTransitionTime":"2026-03-10T14:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.649394 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.649478 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.649514 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.649546 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.649566 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:03Z","lastTransitionTime":"2026-03-10T14:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.752782 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.752863 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.752884 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.752915 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.752934 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:03Z","lastTransitionTime":"2026-03-10T14:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.856096 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.856200 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.856218 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.856246 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.856266 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:03Z","lastTransitionTime":"2026-03-10T14:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.939157 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.939262 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.939370 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.939422 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.939506 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.939558 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.939587 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.939598 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.939704 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:19.939666319 +0000 UTC m=+104.503186276 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.939700 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.939812 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.939823 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:19.939799352 +0000 UTC m=+104.503319309 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.939773 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.940014 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:19.939995778 +0000 UTC m=+104.503515735 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.939841 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:03 crc kubenswrapper[4911]: E0310 14:03:03.940184 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:19.940166972 +0000 UTC m=+104.503686939 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.951671 4911 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.959980 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.960038 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.960059 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.960549 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:03 crc kubenswrapper[4911]: I0310 14:03:03.960608 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:03Z","lastTransitionTime":"2026-03-10T14:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.064267 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.064347 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.064371 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.064411 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.064435 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:04Z","lastTransitionTime":"2026-03-10T14:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.141712 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:03:04 crc kubenswrapper[4911]: E0310 14:03:04.141917 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 14:03:20.141881879 +0000 UTC m=+104.705401836 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.142024 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:04 crc kubenswrapper[4911]: E0310 14:03:04.142239 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:03:04 crc kubenswrapper[4911]: E0310 14:03:04.142366 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs podName:d7a44efc-20ad-4c01-9606-e6fdb5e0c721 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:20.142338911 +0000 UTC m=+104.705858858 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs") pod "network-metrics-daemon-r28f8" (UID: "d7a44efc-20ad-4c01-9606-e6fdb5e0c721") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.168619 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.168704 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.168789 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.168829 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.168850 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:04Z","lastTransitionTime":"2026-03-10T14:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.192695 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.192910 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.193101 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:04 crc kubenswrapper[4911]: E0310 14:03:04.193368 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:04 crc kubenswrapper[4911]: E0310 14:03:04.193609 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:04 crc kubenswrapper[4911]: E0310 14:03:04.194098 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.272265 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.272331 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.272346 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.272367 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.272381 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:04Z","lastTransitionTime":"2026-03-10T14:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.375270 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.375336 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.375355 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.375381 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.375399 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:04Z","lastTransitionTime":"2026-03-10T14:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.479948 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.479984 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.479994 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.480032 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.480046 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:04Z","lastTransitionTime":"2026-03-10T14:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.583881 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.583947 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.583964 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.583987 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.584002 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:04Z","lastTransitionTime":"2026-03-10T14:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.607202 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nsxjn" event={"ID":"fc662696-d402-4969-bebd-00fa42e63075","Type":"ContainerStarted","Data":"6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.609378 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" event={"ID":"349ee2ee-803a-404b-9aa2-2230eabdbb56","Type":"ContainerStarted","Data":"8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.621816 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.632199 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.646516 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.660250 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.670713 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.686703 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.686764 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.686774 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.686789 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.686798 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:04Z","lastTransitionTime":"2026-03-10T14:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.692716 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.703853 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.717409 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.735412 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.745909 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.761358 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.773397 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.789907 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.789966 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.789979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.789998 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.790013 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:04Z","lastTransitionTime":"2026-03-10T14:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.810328 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.837036 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.854201 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.861420 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.872442 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d2
9d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.884340 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.892767 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.892802 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.892815 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.892835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.892847 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:04Z","lastTransitionTime":"2026-03-10T14:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.898487 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.909444 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.917601 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.925166 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.937760 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.950267 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.961799 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.975530 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.983335 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.990364 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.994858 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.995017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.995090 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.995158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:04 crc kubenswrapper[4911]: I0310 14:03:04.995214 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:04Z","lastTransitionTime":"2026-03-10T14:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.007902 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.015813 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.023861 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.032554 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.098034 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.098077 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.098094 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.098120 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.098142 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:05Z","lastTransitionTime":"2026-03-10T14:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.193199 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:05 crc kubenswrapper[4911]: E0310 14:03:05.193385 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.200223 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.200253 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.200262 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.200275 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.200284 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:05Z","lastTransitionTime":"2026-03-10T14:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.302452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.302800 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.302813 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.302826 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.302834 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:05Z","lastTransitionTime":"2026-03-10T14:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.405861 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.405907 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.405916 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.405928 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.406123 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:05Z","lastTransitionTime":"2026-03-10T14:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.510184 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.510250 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.510269 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.510294 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.510313 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:05Z","lastTransitionTime":"2026-03-10T14:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.612628 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.612696 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.612709 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.612749 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.612762 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:05Z","lastTransitionTime":"2026-03-10T14:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.615229 4911 generic.go:334] "Generic (PLEG): container finished" podID="349ee2ee-803a-404b-9aa2-2230eabdbb56" containerID="8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb" exitCode=0 Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.615339 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" event={"ID":"349ee2ee-803a-404b-9aa2-2230eabdbb56","Type":"ContainerDied","Data":"8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb"} Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.634349 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.647223 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.663826 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.675846 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.684893 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.697832 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.715107 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.720631 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.720675 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.720690 4911 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.720713 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.720745 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:05Z","lastTransitionTime":"2026-03-10T14:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.725880 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.734834 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.744636 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.751688 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.758156 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.773892 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.784457 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.793517 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.802925 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.822965 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.823006 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.823019 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.823034 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.823047 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:05Z","lastTransitionTime":"2026-03-10T14:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.927060 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.927415 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.927427 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.927448 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:05 crc kubenswrapper[4911]: I0310 14:03:05.927462 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:05Z","lastTransitionTime":"2026-03-10T14:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.031252 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.031351 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.031378 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.031417 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.031443 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.133405 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.133450 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.133460 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.133476 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.133486 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.192505 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:06 crc kubenswrapper[4911]: E0310 14:03:06.192669 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.192680 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.192782 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:06 crc kubenswrapper[4911]: E0310 14:03:06.192876 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:06 crc kubenswrapper[4911]: E0310 14:03:06.193121 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.209695 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.221654 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.222823 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.222876 4911 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.222891 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.222921 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.222932 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.234162 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: E0310 14:03:06.237101 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.240745 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.240775 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.240786 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.240802 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.240815 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.246170 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: E0310 14:03:06.251260 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.256942 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.259872 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.259937 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.259967 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.261629 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.261705 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: E0310 14:03:06.273825 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.278176 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.278220 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.278238 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.278262 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.278279 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.290529 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: E0310 14:03:06.296025 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae66
9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\
\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":
485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"sys
temUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.301747 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.301786 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.301796 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.301811 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.301821 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.304848 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: E0310 14:03:06.311496 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: E0310 14:03:06.311621 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.313056 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.313075 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.313084 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.313098 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.313108 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.315672 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.327703 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.340279 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.350233 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.359622 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.376056 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.389953 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.399574 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.408248 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.416342 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.416377 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.416387 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.416405 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.416417 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.518396 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.518437 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.518450 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.518471 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.518484 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.620375 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.620424 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.620437 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.620453 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.620465 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.621319 4911 generic.go:334] "Generic (PLEG): container finished" podID="349ee2ee-803a-404b-9aa2-2230eabdbb56" containerID="88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b" exitCode=0 Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.621356 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" event={"ID":"349ee2ee-803a-404b-9aa2-2230eabdbb56","Type":"ContainerDied","Data":"88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.641031 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.661329 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.678060 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.693202 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.706494 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.726603 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.726662 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.726682 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.726712 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.726764 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.727434 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.746767 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.764785 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.783988 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.805145 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.815657 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.829466 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.829535 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.829554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.829581 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.829599 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.844585 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.860617 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.875973 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.888199 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.900575 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.933371 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.933449 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.933466 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.933492 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:06 crc kubenswrapper[4911]: I0310 14:03:06.933508 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:06Z","lastTransitionTime":"2026-03-10T14:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.037055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.037124 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.037146 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.037173 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.037192 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:07Z","lastTransitionTime":"2026-03-10T14:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.147158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.147687 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.147972 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.148153 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.148298 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:07Z","lastTransitionTime":"2026-03-10T14:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.193287 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:07 crc kubenswrapper[4911]: E0310 14:03:07.193492 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.251626 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.252163 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.252304 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.252521 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.252685 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:07Z","lastTransitionTime":"2026-03-10T14:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.355997 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.356281 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.356353 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.356429 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.356490 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:07Z","lastTransitionTime":"2026-03-10T14:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.459663 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.459982 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.460114 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.460196 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.460263 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:07Z","lastTransitionTime":"2026-03-10T14:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.565045 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.565116 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.565134 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.565160 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.565179 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:07Z","lastTransitionTime":"2026-03-10T14:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.628587 4911 generic.go:334] "Generic (PLEG): container finished" podID="349ee2ee-803a-404b-9aa2-2230eabdbb56" containerID="e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603" exitCode=0 Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.628673 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" event={"ID":"349ee2ee-803a-404b-9aa2-2230eabdbb56","Type":"ContainerDied","Data":"e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.643690 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc 
kubenswrapper[4911]: I0310 14:03:07.665937 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.674848 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.674915 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.674931 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.674953 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.674968 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:07Z","lastTransitionTime":"2026-03-10T14:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.683144 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.694958 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.708961 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.722763 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.746505 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.758814 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.770544 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.780330 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.780388 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.780400 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.780427 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.780442 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:07Z","lastTransitionTime":"2026-03-10T14:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.786347 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.797209 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.813651 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.825004 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.837758 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.850218 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.874894 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.883029 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.883074 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.883098 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.883116 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.883126 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:07Z","lastTransitionTime":"2026-03-10T14:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.987827 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.987923 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.987950 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.987984 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:07 crc kubenswrapper[4911]: I0310 14:03:07.988006 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:07Z","lastTransitionTime":"2026-03-10T14:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.092117 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.092206 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.092219 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.092247 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.092261 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:08Z","lastTransitionTime":"2026-03-10T14:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.193149 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.193245 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.193346 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:08 crc kubenswrapper[4911]: E0310 14:03:08.193531 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:08 crc kubenswrapper[4911]: E0310 14:03:08.194369 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:08 crc kubenswrapper[4911]: E0310 14:03:08.194554 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.196878 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.196950 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.196970 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.196997 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.197016 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:08Z","lastTransitionTime":"2026-03-10T14:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.301399 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.301470 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.301491 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.301518 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.301542 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:08Z","lastTransitionTime":"2026-03-10T14:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.404487 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.404544 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.404557 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.404579 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.404592 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:08Z","lastTransitionTime":"2026-03-10T14:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.507936 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.507984 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.507995 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.508014 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.508027 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:08Z","lastTransitionTime":"2026-03-10T14:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.611903 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.612004 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.612022 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.612051 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.612075 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:08Z","lastTransitionTime":"2026-03-10T14:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.635633 4911 generic.go:334] "Generic (PLEG): container finished" podID="349ee2ee-803a-404b-9aa2-2230eabdbb56" containerID="5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467" exitCode=0 Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.635698 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" event={"ID":"349ee2ee-803a-404b-9aa2-2230eabdbb56","Type":"ContainerDied","Data":"5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467"} Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.645690 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.676322 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.685791 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.697525 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.707171 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.714010 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.714059 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.714072 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.714091 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.714105 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:08Z","lastTransitionTime":"2026-03-10T14:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.717717 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection 
refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.729055 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.739454 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.750475 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.761082 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.769607 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.786858 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.804522 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.816309 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.816369 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.816380 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.816397 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.816409 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:08Z","lastTransitionTime":"2026-03-10T14:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.821317 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.832627 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.845952 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.919568 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.919629 4911 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.919640 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.919688 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:08 crc kubenswrapper[4911]: I0310 14:03:08.919704 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:08Z","lastTransitionTime":"2026-03-10T14:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.022598 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.022654 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.022663 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.022676 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.022686 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:09Z","lastTransitionTime":"2026-03-10T14:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.125879 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.125921 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.125931 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.125946 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.125956 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:09Z","lastTransitionTime":"2026-03-10T14:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.192613 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:09 crc kubenswrapper[4911]: E0310 14:03:09.192849 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.194020 4911 scope.go:117] "RemoveContainer" containerID="75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666" Mar 10 14:03:09 crc kubenswrapper[4911]: E0310 14:03:09.194348 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.228586 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.228623 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.228632 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.228646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.228655 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:09Z","lastTransitionTime":"2026-03-10T14:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.332582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.332635 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.332648 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.332682 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.332697 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:09Z","lastTransitionTime":"2026-03-10T14:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.435627 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.435702 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.435756 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.435787 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.435808 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:09Z","lastTransitionTime":"2026-03-10T14:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.538549 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.538632 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.538660 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.538710 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.538772 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:09Z","lastTransitionTime":"2026-03-10T14:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.642029 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.642118 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.642139 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.642168 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.642189 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:09Z","lastTransitionTime":"2026-03-10T14:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.645801 4911 generic.go:334] "Generic (PLEG): container finished" podID="349ee2ee-803a-404b-9aa2-2230eabdbb56" containerID="497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1" exitCode=0 Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.645912 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" event={"ID":"349ee2ee-803a-404b-9aa2-2230eabdbb56","Type":"ContainerDied","Data":"497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.664673 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.679603 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.696808 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.716790 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19
fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.734597 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.744917 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.744962 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.744973 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.744987 4911 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.744997 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:09Z","lastTransitionTime":"2026-03-10T14:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.745140 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.755876 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.770018 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.783278 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.794003 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.806326 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.816351 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.846689 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.848157 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.848194 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.848206 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.848224 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.848235 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:09Z","lastTransitionTime":"2026-03-10T14:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.858367 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.867632 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.879535 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.951210 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.951272 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.951283 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.951303 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:09 crc kubenswrapper[4911]: I0310 14:03:09.951318 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:09Z","lastTransitionTime":"2026-03-10T14:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.054397 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.054877 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.054995 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.055162 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.055261 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:10Z","lastTransitionTime":"2026-03-10T14:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.158230 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.158275 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.158285 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.158302 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.158313 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:10Z","lastTransitionTime":"2026-03-10T14:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.192477 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:10 crc kubenswrapper[4911]: E0310 14:03:10.192623 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.193893 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:10 crc kubenswrapper[4911]: E0310 14:03:10.193990 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.194190 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:10 crc kubenswrapper[4911]: E0310 14:03:10.194459 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.262940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.263040 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.263058 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.263124 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.263145 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:10Z","lastTransitionTime":"2026-03-10T14:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.367403 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.367533 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.367603 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.367640 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.367701 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:10Z","lastTransitionTime":"2026-03-10T14:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.471054 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.471132 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.471150 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.471613 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.471638 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:10Z","lastTransitionTime":"2026-03-10T14:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.574234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.574286 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.574297 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.574315 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.574327 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:10Z","lastTransitionTime":"2026-03-10T14:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.654283 4911 generic.go:334] "Generic (PLEG): container finished" podID="349ee2ee-803a-404b-9aa2-2230eabdbb56" containerID="867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857" exitCode=0 Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.654338 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" event={"ID":"349ee2ee-803a-404b-9aa2-2230eabdbb56","Type":"ContainerDied","Data":"867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.667540 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.676821 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.676860 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.676878 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.676902 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.676922 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:10Z","lastTransitionTime":"2026-03-10T14:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.686208 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.700271 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.715829 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.732085 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.744353 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.757115 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.778147 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee
9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.781452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.781493 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.781511 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.781537 
4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.781553 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:10Z","lastTransitionTime":"2026-03-10T14:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.794353 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.804503 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.814036 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.824612 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.836543 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.855678 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.864933 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.877059 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.884535 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.884585 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.884596 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.884622 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.884640 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:10Z","lastTransitionTime":"2026-03-10T14:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.987579 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.987644 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.987661 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.987690 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:10 crc kubenswrapper[4911]: I0310 14:03:10.987706 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:10Z","lastTransitionTime":"2026-03-10T14:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.090383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.090427 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.090437 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.090453 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.090463 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:11Z","lastTransitionTime":"2026-03-10T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.192406 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:11 crc kubenswrapper[4911]: E0310 14:03:11.192825 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.192872 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.192890 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.192899 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.192910 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.192921 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:11Z","lastTransitionTime":"2026-03-10T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.296372 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.296448 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.296461 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.296481 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.296516 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:11Z","lastTransitionTime":"2026-03-10T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.399782 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.399820 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.399832 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.399855 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.399866 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:11Z","lastTransitionTime":"2026-03-10T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.503535 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.503652 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.503666 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.503684 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.503696 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:11Z","lastTransitionTime":"2026-03-10T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.607335 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.607412 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.607435 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.607467 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.607486 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:11Z","lastTransitionTime":"2026-03-10T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.664593 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" event={"ID":"349ee2ee-803a-404b-9aa2-2230eabdbb56","Type":"ContainerStarted","Data":"9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.667831 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.667890 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.679658 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers 
with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.701375 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.709950 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.709992 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.710043 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.710067 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.710082 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:11Z","lastTransitionTime":"2026-03-10T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.718382 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.734512 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.750074 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.759758 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.769312 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.779072 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.791116 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.802335 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.813070 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.813134 4911 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.813179 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.813202 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.813216 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:11Z","lastTransitionTime":"2026-03-10T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.814952 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.823168 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.839104 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.851027 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.867194 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.885760 4911 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.904769 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.916434 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.916489 4911 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.916502 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.916528 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.916542 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:11Z","lastTransitionTime":"2026-03-10T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.928566 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.951399 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.967748 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:11 crc kubenswrapper[4911]: I0310 14:03:11.984233 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.015939 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.019636 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.019798 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.019881 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.019969 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.020074 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:12Z","lastTransitionTime":"2026-03-10T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.035794 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.053647 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.068652 4911 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.084767 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19
fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.101410 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.120596 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.123403 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.123451 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.123462 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.123483 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.123493 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:12Z","lastTransitionTime":"2026-03-10T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.139862 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.154374 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc 
kubenswrapper[4911]: I0310 14:03:12.165984 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.184873 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.192387 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.192528 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:12 crc kubenswrapper[4911]: E0310 14:03:12.192557 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.192815 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:12 crc kubenswrapper[4911]: E0310 14:03:12.192829 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:12 crc kubenswrapper[4911]: E0310 14:03:12.193223 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.212120 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.226501 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.226572 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.226590 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.226617 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.226636 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:12Z","lastTransitionTime":"2026-03-10T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.329428 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.329510 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.329530 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.329561 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.329582 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:12Z","lastTransitionTime":"2026-03-10T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.432586 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.432655 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.432672 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.432701 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.432748 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:12Z","lastTransitionTime":"2026-03-10T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.535826 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.535876 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.535890 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.535907 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.535919 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:12Z","lastTransitionTime":"2026-03-10T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.638149 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.638201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.638215 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.638239 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.638258 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:12Z","lastTransitionTime":"2026-03-10T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.742440 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.743113 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.743207 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.743303 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.743395 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:12Z","lastTransitionTime":"2026-03-10T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.846523 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.846590 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.846615 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.846646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.846668 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:12Z","lastTransitionTime":"2026-03-10T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.951411 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.951450 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.951459 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.951473 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:12 crc kubenswrapper[4911]: I0310 14:03:12.951482 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:12Z","lastTransitionTime":"2026-03-10T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.054444 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.054475 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.054483 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.054497 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.054505 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:13Z","lastTransitionTime":"2026-03-10T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.157111 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.157160 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.157171 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.157188 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.157204 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:13Z","lastTransitionTime":"2026-03-10T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.192858 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:13 crc kubenswrapper[4911]: E0310 14:03:13.193073 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.259155 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.259597 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.259830 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.260030 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.260185 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:13Z","lastTransitionTime":"2026-03-10T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.362845 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.362887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.362897 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.362911 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.362921 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:13Z","lastTransitionTime":"2026-03-10T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.465890 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.465955 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.465972 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.465997 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.466017 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:13Z","lastTransitionTime":"2026-03-10T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.568940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.568982 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.568992 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.569008 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.569020 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:13Z","lastTransitionTime":"2026-03-10T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.672205 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.672442 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.672484 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.672518 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.672542 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:13Z","lastTransitionTime":"2026-03-10T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.776045 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.776123 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.776142 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.776169 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.776190 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:13Z","lastTransitionTime":"2026-03-10T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.879222 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.879546 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.879613 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.879695 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.879788 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:13Z","lastTransitionTime":"2026-03-10T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.982407 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.982466 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.982484 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.982510 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:13 crc kubenswrapper[4911]: I0310 14:03:13.982527 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:13Z","lastTransitionTime":"2026-03-10T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.085190 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.085251 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.085263 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.085283 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.085305 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:14Z","lastTransitionTime":"2026-03-10T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.187959 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.188008 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.188020 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.188039 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.188052 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:14Z","lastTransitionTime":"2026-03-10T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.192516 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.192553 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.192602 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:14 crc kubenswrapper[4911]: E0310 14:03:14.192780 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:14 crc kubenswrapper[4911]: E0310 14:03:14.193146 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:14 crc kubenswrapper[4911]: E0310 14:03:14.193509 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.290900 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.290948 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.290959 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.290976 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.290987 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:14Z","lastTransitionTime":"2026-03-10T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.393582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.393623 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.393631 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.393645 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.393656 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:14Z","lastTransitionTime":"2026-03-10T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.497047 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.497086 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.497096 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.497113 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.497123 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:14Z","lastTransitionTime":"2026-03-10T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.600063 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.600130 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.600667 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.600819 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.600854 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:14Z","lastTransitionTime":"2026-03-10T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.687502 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" event={"ID":"7cae19a2-5349-4f13-94ab-bfe066e4589a","Type":"ContainerStarted","Data":"e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.687616 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" event={"ID":"7cae19a2-5349-4f13-94ab-bfe066e4589a","Type":"ContainerStarted","Data":"5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.689892 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vfj7m" event={"ID":"acf8e218-4a2a-4d62-9aa8-7ecca1109d35","Type":"ContainerStarted","Data":"13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.704824 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.704866 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.704877 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.704901 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.704917 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:14Z","lastTransitionTime":"2026-03-10T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.706260 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.743126 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.760966 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:
48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.783554 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.801046 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc 
kubenswrapper[4911]: I0310 14:03:14.807395 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.807422 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.807433 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.807453 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.807467 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:14Z","lastTransitionTime":"2026-03-10T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.814907 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.835497 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.857286 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.876769 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.910863 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.910941 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.910959 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.910986 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.911007 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:14Z","lastTransitionTime":"2026-03-10T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.912523 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.942765 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.966354 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.982826 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:14 crc kubenswrapper[4911]: I0310 14:03:14.996761 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:14Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.012002 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.013530 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.013591 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.013605 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.013628 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.013642 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:15Z","lastTransitionTime":"2026-03-10T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.024807 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.051604 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.067619 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}
\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.080563 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.094880 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.109138 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.116517 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.116590 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.116613 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.116642 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.116662 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:15Z","lastTransitionTime":"2026-03-10T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.123306 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.147634 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.163495 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.193212 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:15 crc kubenswrapper[4911]: E0310 14:03:15.193682 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.197968 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T
14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"
192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.215406 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7
c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.220383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.220443 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.220465 4911 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.220502 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.220526 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:15Z","lastTransitionTime":"2026-03-10T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.237765 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.254370 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc 
kubenswrapper[4911]: I0310 14:03:15.268078 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.294199 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.318469 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19
fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.323614 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.323701 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.323766 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.323802 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.323826 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:15Z","lastTransitionTime":"2026-03-10T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.338697 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.355502 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.373120 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.426771 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.426833 4911 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.426852 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.426880 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.426900 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:15Z","lastTransitionTime":"2026-03-10T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.529966 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.530030 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.530043 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.530064 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.530077 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:15Z","lastTransitionTime":"2026-03-10T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.633420 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.633483 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.633499 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.633519 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.633533 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:15Z","lastTransitionTime":"2026-03-10T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.703475 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.731069 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.737392 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.737473 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.737500 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.737533 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.737552 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:15Z","lastTransitionTime":"2026-03-10T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.755961 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.780231 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.801623 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.821103 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.841148 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.841223 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.841241 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.841271 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.841294 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:15Z","lastTransitionTime":"2026-03-10T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.859177 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.878315 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.917058 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.937821 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.944492 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.944582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.944611 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.944646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.944670 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:15Z","lastTransitionTime":"2026-03-10T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.962201 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:15 crc kubenswrapper[4911]: I0310 14:03:15.987495 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19
fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:15Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.008983 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.034176 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.048479 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.048540 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.048570 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.048609 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.048632 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.060317 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.082118 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc 
kubenswrapper[4911]: I0310 14:03:16.099685 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.128190 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.152408 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.152492 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.152532 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.152566 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.152584 4911 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.192902 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.192902 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:16 crc kubenswrapper[4911]: E0310 14:03:16.193084 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:16 crc kubenswrapper[4911]: E0310 14:03:16.193214 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.193506 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:16 crc kubenswrapper[4911]: E0310 14:03:16.194084 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.219678 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.239847 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.258299 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.258399 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.258424 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.258461 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.258489 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.274764 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.295999 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.325856 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.349692 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.361306 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.361342 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.361352 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.361369 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.361382 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.370801 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.389207 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19
fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.409520 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.428434 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.449360 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.462766 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc 
kubenswrapper[4911]: I0310 14:03:16.465169 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.465222 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.465235 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.465258 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.465273 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.478458 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.492986 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d
650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.504377 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.518143 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.532410 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.568369 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc 
kubenswrapper[4911]: I0310 14:03:16.568444 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.568463 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.568513 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.568534 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.673673 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.673789 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.673810 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.673842 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.673872 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.707241 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.707306 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.707336 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.707366 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.707386 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.709821 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9p6ll" event={"ID":"a0fdaa42-a77f-4f62-b94f-6659225e12af","Type":"ContainerStarted","Data":"1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2"} Mar 10 14:03:16 crc kubenswrapper[4911]: E0310 14:03:16.732225 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index
@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\
\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io
/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\
\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.736424 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.739321 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.739385 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.739408 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.739445 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.739526 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: E0310 14:03:16.763400 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.763713 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.769216 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.769286 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.769302 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.769320 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.769353 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.788239 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: E0310 14:03:16.794933 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.801994 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.802058 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.802073 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.802134 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.802150 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.805192 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: E0310 14:03:16.824520 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.828088 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.829622 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.829668 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.829680 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.829701 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.829715 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: E0310 14:03:16.844537 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: E0310 14:03:16.844683 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.846853 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.846891 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.846905 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.846920 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.846933 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.847302 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot 
construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.878834 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"fini
shedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.892923 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b
3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\
"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.910048 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.923457 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.936705 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.949622 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.949666 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.949678 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.949693 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.949706 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:16Z","lastTransitionTime":"2026-03-10T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.950097 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.968269 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:16 crc kubenswrapper[4911]: I0310 14:03:16.986375 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc 
kubenswrapper[4911]: I0310 14:03:17.001193 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.018908 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.041044 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19
fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.052515 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.052566 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.052578 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.052599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.052612 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:17Z","lastTransitionTime":"2026-03-10T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.156230 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.156292 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.156304 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.156327 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.156342 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:17Z","lastTransitionTime":"2026-03-10T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.193127 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:17 crc kubenswrapper[4911]: E0310 14:03:17.193831 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.258794 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.258867 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.258885 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.258919 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.258946 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:17Z","lastTransitionTime":"2026-03-10T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.361779 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.361833 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.361847 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.361875 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.362095 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:17Z","lastTransitionTime":"2026-03-10T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.464468 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.464509 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.464520 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.464542 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.464556 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:17Z","lastTransitionTime":"2026-03-10T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.567466 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.567554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.567568 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.567591 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.567606 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:17Z","lastTransitionTime":"2026-03-10T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.670642 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.670755 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.670778 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.670810 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.670831 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:17Z","lastTransitionTime":"2026-03-10T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.714577 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64" exitCode=0 Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.714663 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.740390 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.759155 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.785107 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.785187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.785211 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 
14:03:17.785248 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.785278 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:17Z","lastTransitionTime":"2026-03-10T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.818202 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.866161 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.880682 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.889430 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.889473 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.889487 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.889513 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.889526 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:17Z","lastTransitionTime":"2026-03-10T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.896271 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.914368 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7
c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.928543 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.940207 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.961787 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.979865 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.993178 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.993224 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.993241 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.993259 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.993274 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:17Z","lastTransitionTime":"2026-03-10T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:17 crc kubenswrapper[4911]: I0310 14:03:17.997761 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:17Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.015206 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc 
kubenswrapper[4911]: I0310 14:03:18.028553 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.047556 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.064251 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19
fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.084357 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.096360 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.096430 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.096443 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:18 crc 
kubenswrapper[4911]: I0310 14:03:18.096469 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.096485 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:18Z","lastTransitionTime":"2026-03-10T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.193355 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.193536 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.193359 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:18 crc kubenswrapper[4911]: E0310 14:03:18.193699 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:18 crc kubenswrapper[4911]: E0310 14:03:18.193532 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:18 crc kubenswrapper[4911]: E0310 14:03:18.193835 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.202460 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.202513 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.202530 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.202554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.202572 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:18Z","lastTransitionTime":"2026-03-10T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.306325 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.306385 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.306403 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.306429 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.306447 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:18Z","lastTransitionTime":"2026-03-10T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.410862 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.410929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.410952 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.410981 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.411002 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:18Z","lastTransitionTime":"2026-03-10T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.514506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.514570 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.514587 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.514617 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.514641 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:18Z","lastTransitionTime":"2026-03-10T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.618656 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.618706 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.618747 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.618773 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.618790 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:18Z","lastTransitionTime":"2026-03-10T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.721318 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.721372 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.721389 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.721414 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.721432 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:18Z","lastTransitionTime":"2026-03-10T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.722675 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.722773 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.731625 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.731699 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.731763 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.731787 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f"} Mar 10 
14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.731805 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.731822 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.750187 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0
dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],
\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.775820 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.799151 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.824794 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.824863 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.824883 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 
14:03:18.824911 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.824930 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:18Z","lastTransitionTime":"2026-03-10T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.825416 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.845174 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.878999 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.898340 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.928438 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.928496 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.928520 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.928553 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.928577 4911 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:18Z","lastTransitionTime":"2026-03-10T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.934036 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4c
fa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.956159 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7
c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.979363 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:18 crc kubenswrapper[4911]: I0310 14:03:18.997965 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:19 crc 
kubenswrapper[4911]: I0310 14:03:19.013496 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:19Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.032411 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.032452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.032469 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.032504 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.032520 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:19Z","lastTransitionTime":"2026-03-10T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.034689 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:19Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.053985 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:19Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.075116 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:19Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.101643 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:19Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.122961 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:19Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.136328 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.136386 4911 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.136405 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.136432 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.136450 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:19Z","lastTransitionTime":"2026-03-10T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.192925 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:19 crc kubenswrapper[4911]: E0310 14:03:19.193118 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.239185 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.239246 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.239265 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.239291 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.239310 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:19Z","lastTransitionTime":"2026-03-10T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.343080 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.343152 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.343172 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.343206 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.343226 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:19Z","lastTransitionTime":"2026-03-10T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.446353 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.446420 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.446438 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.446463 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.446483 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:19Z","lastTransitionTime":"2026-03-10T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.550476 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.550880 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.550889 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.550906 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.550917 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:19Z","lastTransitionTime":"2026-03-10T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.654065 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.654114 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.654125 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.654143 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.654155 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:19Z","lastTransitionTime":"2026-03-10T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.756677 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.757034 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.757099 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.757159 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.757229 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:19Z","lastTransitionTime":"2026-03-10T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.860873 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.861884 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.861926 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.861958 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.861978 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:19Z","lastTransitionTime":"2026-03-10T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.965409 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.965454 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.965466 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.965486 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:19 crc kubenswrapper[4911]: I0310 14:03:19.965499 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:19Z","lastTransitionTime":"2026-03-10T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.036833 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.036895 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.036966 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.036999 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037115 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037163 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037183 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037264 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:52.037233546 +0000 UTC m=+136.600753663 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037307 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037401 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 14:03:52.037372219 +0000 UTC m=+136.600892326 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037508 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037527 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037544 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037587 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:52.037574515 +0000 UTC m=+136.601094642 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037634 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.037672 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:52.037660577 +0000 UTC m=+136.601180504 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.068341 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.068386 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.068397 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.068417 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.068432 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:20Z","lastTransitionTime":"2026-03-10T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.171408 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.171485 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.171505 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.171600 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.171683 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:20Z","lastTransitionTime":"2026-03-10T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.192851 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.193007 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.193048 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.193138 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.193403 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.193629 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.238741 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.238838 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 14:03:52.238813149 +0000 UTC m=+136.802333066 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.239045 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.239210 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:03:20 crc kubenswrapper[4911]: E0310 14:03:20.239283 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs podName:d7a44efc-20ad-4c01-9606-e6fdb5e0c721 nodeName:}" failed. No retries permitted until 2026-03-10 14:03:52.239266191 +0000 UTC m=+136.802786128 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs") pod "network-metrics-daemon-r28f8" (UID: "d7a44efc-20ad-4c01-9606-e6fdb5e0c721") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.275637 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.275690 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.275703 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.275747 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.275762 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:20Z","lastTransitionTime":"2026-03-10T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.379746 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.379808 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.379830 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.379859 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.379884 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:20Z","lastTransitionTime":"2026-03-10T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.483329 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.483407 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.483432 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.483463 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.483483 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:20Z","lastTransitionTime":"2026-03-10T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.587338 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.587691 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.587710 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.587769 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.587796 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:20Z","lastTransitionTime":"2026-03-10T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.691506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.691573 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.691599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.691632 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.691654 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:20Z","lastTransitionTime":"2026-03-10T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.742795 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.753884 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.765224 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:20Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.787592 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:20Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.795246 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.795308 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.795330 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 
14:03:20.795364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.795385 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:20Z","lastTransitionTime":"2026-03-10T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.811683 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:20Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.832306 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:20Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.852154 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:20Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.889302 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:20Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.898632 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.898759 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.898781 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.898817 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.898844 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:20Z","lastTransitionTime":"2026-03-10T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.909100 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:20Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.956460 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:20Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.974126 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:
48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:20Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:20 crc kubenswrapper[4911]: I0310 14:03:20.994243 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:20Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.001551 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.001623 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.001646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.001673 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.001707 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:21Z","lastTransitionTime":"2026-03-10T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.010018 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:21Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.033855 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:21Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.054133 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:21Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.074643 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:21Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.094484 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:21Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.105273 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.105327 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.105342 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.105361 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.105374 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:21Z","lastTransitionTime":"2026-03-10T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.112535 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:21Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.133912 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:21Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:21 crc 
kubenswrapper[4911]: I0310 14:03:21.192833 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:21 crc kubenswrapper[4911]: E0310 14:03:21.193152 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.209247 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.209325 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.209337 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.209358 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.209368 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:21Z","lastTransitionTime":"2026-03-10T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.313858 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.313947 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.314004 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.314043 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.314067 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:21Z","lastTransitionTime":"2026-03-10T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.418469 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.418593 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.418620 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.418659 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.418687 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:21Z","lastTransitionTime":"2026-03-10T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.521949 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.522016 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.522031 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.522053 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.522068 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:21Z","lastTransitionTime":"2026-03-10T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.625400 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.625455 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.625467 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.625489 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.625504 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:21Z","lastTransitionTime":"2026-03-10T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.729517 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.729590 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.729612 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.729640 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.729664 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:21Z","lastTransitionTime":"2026-03-10T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.833558 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.833641 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.833662 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.833697 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.833721 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:21Z","lastTransitionTime":"2026-03-10T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.936162 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.936213 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.936225 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.936250 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:21 crc kubenswrapper[4911]: I0310 14:03:21.936267 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:21Z","lastTransitionTime":"2026-03-10T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.040562 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.040679 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.040705 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.040767 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.040832 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:22Z","lastTransitionTime":"2026-03-10T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.144373 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.144476 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.144499 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.144528 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.144557 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:22Z","lastTransitionTime":"2026-03-10T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.192866 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.193086 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:22 crc kubenswrapper[4911]: E0310 14:03:22.193176 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.192902 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:22 crc kubenswrapper[4911]: E0310 14:03:22.193330 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:22 crc kubenswrapper[4911]: E0310 14:03:22.193472 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.248748 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.248837 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.248860 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.248898 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.248911 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:22Z","lastTransitionTime":"2026-03-10T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.352931 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.353023 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.353048 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.353085 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.353110 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:22Z","lastTransitionTime":"2026-03-10T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.456802 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.456902 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.456923 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.456954 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.456978 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:22Z","lastTransitionTime":"2026-03-10T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.561220 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.561391 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.561416 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.561445 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.561466 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:22Z","lastTransitionTime":"2026-03-10T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.665170 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.665247 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.665268 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.665304 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.665323 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:22Z","lastTransitionTime":"2026-03-10T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.771962 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.772020 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.772039 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.772070 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.772095 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:22Z","lastTransitionTime":"2026-03-10T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.875345 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.875657 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.875671 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.875693 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.875708 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:22Z","lastTransitionTime":"2026-03-10T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.979024 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.979126 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.979163 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.979201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:22 crc kubenswrapper[4911]: I0310 14:03:22.979229 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:22Z","lastTransitionTime":"2026-03-10T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.082377 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.082440 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.082458 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.082487 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.082507 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:23Z","lastTransitionTime":"2026-03-10T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.185979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.186090 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.186109 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.186139 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.186158 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:23Z","lastTransitionTime":"2026-03-10T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.193302 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:23 crc kubenswrapper[4911]: E0310 14:03:23.193558 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.288872 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.288926 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.288942 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.288964 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.288980 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:23Z","lastTransitionTime":"2026-03-10T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.393213 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.393272 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.393288 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.393312 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.393356 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:23Z","lastTransitionTime":"2026-03-10T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.496879 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.496928 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.496940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.496959 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.496970 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:23Z","lastTransitionTime":"2026-03-10T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.600489 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.600542 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.600551 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.600571 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.600582 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:23Z","lastTransitionTime":"2026-03-10T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.703906 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.703970 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.703983 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.704017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.704046 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:23Z","lastTransitionTime":"2026-03-10T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.775621 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.776008 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.798281 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.807136 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.807190 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.807204 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.807225 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.807237 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:23Z","lastTransitionTime":"2026-03-10T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.815048 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.822643 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.835512 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.849203 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc 
kubenswrapper[4911]: I0310 14:03:23.863944 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.886294 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.901357 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19
fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.910909 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.910945 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.910955 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.910972 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.910983 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:23Z","lastTransitionTime":"2026-03-10T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.920643 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.939384 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.953585 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.967236 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:23 crc kubenswrapper[4911]: I0310 14:03:23.995276 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:23Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.013659 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.013765 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.013786 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.013820 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.013844 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:24Z","lastTransitionTime":"2026-03-10T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.014323 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.037069 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.056181 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.075470 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.093826 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.117797 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.117866 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.117883 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.117912 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.117933 4911 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:24Z","lastTransitionTime":"2026-03-10T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.122281 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\
\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.146065 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.166585 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.187244 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.192436 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.192652 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.192828 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:24 crc kubenswrapper[4911]: E0310 14:03:24.192783 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:24 crc kubenswrapper[4911]: E0310 14:03:24.193292 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.193881 4911 scope.go:117] "RemoveContainer" containerID="75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666" Mar 10 14:03:24 crc kubenswrapper[4911]: E0310 14:03:24.193707 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.211627 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.220472 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.220499 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.220511 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.220527 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.220538 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:24Z","lastTransitionTime":"2026-03-10T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.229645 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc 
kubenswrapper[4911]: I0310 14:03:24.246266 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.270004 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.288746 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.306577 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.324309 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537
adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.325221 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.325326 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.325345 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.325379 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.325399 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:24Z","lastTransitionTime":"2026-03-10T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.342326 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.374529 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.395482 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc 
kubenswrapper[4911]: I0310 14:03:24.418467 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.429042 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.429104 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.429122 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.429151 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.429169 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:24Z","lastTransitionTime":"2026-03-10T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.435349 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.462138 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.533360 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.533415 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.533428 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.533446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.533473 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:24Z","lastTransitionTime":"2026-03-10T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.636788 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.636848 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.636862 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.636887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.636903 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:24Z","lastTransitionTime":"2026-03-10T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.739706 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.739780 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.739794 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.739816 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.739830 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:24Z","lastTransitionTime":"2026-03-10T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.781848 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.784447 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.784973 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.785203 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.803198 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.817892 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc 
kubenswrapper[4911]: I0310 14:03:24.818319 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.830658 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca
\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.842040 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.842085 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.842099 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.842120 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.842136 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:24Z","lastTransitionTime":"2026-03-10T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.848578 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g
jz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-b
incopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.867153 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.880694 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.897920 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.911327 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.929087 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.942534 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.944490 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.944523 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.944535 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.944553 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.944565 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:24Z","lastTransitionTime":"2026-03-10T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.955579 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.968026 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:24 crc kubenswrapper[4911]: I0310 14:03:24.989038 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.003141 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:24Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.024292 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.046944 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.047598 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.047633 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.047646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.047668 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.047682 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:25Z","lastTransitionTime":"2026-03-10T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.063749 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.075514 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.088812 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.100292 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.119330 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.129645 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.138607 4911 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.150344 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.150383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.150395 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.150412 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.150426 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:25Z","lastTransitionTime":"2026-03-10T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.152682 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc 
kubenswrapper[4911]: I0310 14:03:25.162813 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.178780 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.192478 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:25 crc kubenswrapper[4911]: E0310 14:03:25.192626 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.192963 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.208518 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.246822 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.274068 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.274115 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.274126 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.274144 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.274155 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:25Z","lastTransitionTime":"2026-03-10T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.281166 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.303945 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.324341 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.337031 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.350545 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ho
sts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.377465 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.377518 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.377531 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.377553 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.377570 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:25Z","lastTransitionTime":"2026-03-10T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.480139 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.480177 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.480186 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.480199 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.480208 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:25Z","lastTransitionTime":"2026-03-10T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.582379 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.582490 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.582526 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.582544 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.582556 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:25Z","lastTransitionTime":"2026-03-10T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.686189 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.686234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.686245 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.686264 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.686276 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:25Z","lastTransitionTime":"2026-03-10T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.788505 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.788573 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.788588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.788613 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.788628 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:25Z","lastTransitionTime":"2026-03-10T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.792026 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/0.log" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.796335 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db" exitCode=1 Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.796384 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db"} Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.797855 4911 scope.go:117] "RemoveContainer" containerID="1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.814072 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337
b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.831873 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.864533 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:25Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 14:03:25.642284 6765 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 14:03:25.642336 6765 handler.go:190] Sending *v1.Namespace event 
handler 5 for removal\\\\nI0310 14:03:25.642342 6765 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 14:03:25.642376 6765 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 14:03:25.642451 6765 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 14:03:25.642462 6765 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 14:03:25.642485 6765 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 14:03:25.642539 6765 factory.go:656] Stopping watch factory\\\\nI0310 14:03:25.642544 6765 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 14:03:25.642545 6765 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 14:03:25.642557 6765 ovnkube.go:599] Stopped ovnkube\\\\nI0310 14:03:25.642557 6765 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 14:03:25.642571 6765 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 14:03:25.642569 6765 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 14:03:25.642591 6765 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:2910\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a6568
6c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.888073 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.890713 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.890813 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.890828 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.890852 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.890869 4911 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:25Z","lastTransitionTime":"2026-03-10T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.915075 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4c
fa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.932759 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7
c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.948543 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.960624 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc 
kubenswrapper[4911]: I0310 14:03:25.974862 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.993512 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:25Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.993804 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.993822 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.993833 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.993853 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:25 crc kubenswrapper[4911]: I0310 14:03:25.993864 4911 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:25Z","lastTransitionTime":"2026-03-10T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.016516 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.038434 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.052140 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.068758 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.082276 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.096353 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.096569 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.096604 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.096615 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.096629 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.096640 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:26Z","lastTransitionTime":"2026-03-10T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.110556 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.193031 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.193279 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.193364 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:26 crc kubenswrapper[4911]: E0310 14:03:26.193357 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:26 crc kubenswrapper[4911]: E0310 14:03:26.193497 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:26 crc kubenswrapper[4911]: E0310 14:03:26.193539 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.198677 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.198733 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.198746 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.198760 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.198770 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:26Z","lastTransitionTime":"2026-03-10T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.209939 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.212421 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.228712 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.244768 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.263254 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.540557 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:25Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 14:03:25.642284 6765 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 14:03:25.642336 6765 handler.go:190] Sending *v1.Namespace event 
handler 5 for removal\\\\nI0310 14:03:25.642342 6765 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 14:03:25.642376 6765 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 14:03:25.642451 6765 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 14:03:25.642462 6765 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 14:03:25.642485 6765 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 14:03:25.642539 6765 factory.go:656] Stopping watch factory\\\\nI0310 14:03:25.642544 6765 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 14:03:25.642545 6765 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 14:03:25.642557 6765 ovnkube.go:599] Stopped ovnkube\\\\nI0310 14:03:25.642557 6765 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 14:03:25.642571 6765 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 14:03:25.642569 6765 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 14:03:25.642591 6765 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:2910\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a6568
6c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.544321 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.544412 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.544423 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.544440 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.544822 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:26Z","lastTransitionTime":"2026-03-10T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.569686 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.603086 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f8
9fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.620208 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.639642 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.647805 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.647856 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.647871 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.647898 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.647914 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:26Z","lastTransitionTime":"2026-03-10T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.657809 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.674512 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.696548 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.715510 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.734545 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc 
kubenswrapper[4911]: I0310 14:03:26.750928 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.752417 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.752485 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.752518 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.752546 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.752559 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:26Z","lastTransitionTime":"2026-03-10T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.775166 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.801931 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.804408 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/0.log" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.808696 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7"} Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.809621 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.836282 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.852773 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.855058 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:26 crc 
kubenswrapper[4911]: I0310 14:03:26.855106 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.855119 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.855143 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.855157 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:26Z","lastTransitionTime":"2026-03-10T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.870368 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.888777 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.901416 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.915227 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.942069 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:25Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 14:03:25.642284 6765 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 14:03:25.642336 6765 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 14:03:25.642342 6765 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0310 14:03:25.642376 6765 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 14:03:25.642451 6765 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 14:03:25.642462 6765 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 14:03:25.642485 6765 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 14:03:25.642539 6765 factory.go:656] Stopping watch factory\\\\nI0310 14:03:25.642544 6765 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 14:03:25.642545 6765 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 14:03:25.642557 6765 ovnkube.go:599] Stopped ovnkube\\\\nI0310 14:03:25.642557 6765 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 14:03:25.642571 6765 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 14:03:25.642569 6765 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 14:03:25.642591 6765 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:2910\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-con
fig/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-s
etup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.959205 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.959263 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.959276 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.959298 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.959312 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:26Z","lastTransitionTime":"2026-03-10T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.960062 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.973046 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:26 crc kubenswrapper[4911]: I0310 14:03:26.996351 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.010169 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.020029 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.020075 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.020086 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.020106 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.020121 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.032989 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: E0310 14:03:27.039856 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.044287 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.044460 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.044484 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.044511 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.044531 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.046837 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc 
kubenswrapper[4911]: I0310 14:03:27.060105 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: E0310 14:03:27.066179 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.070052 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.070073 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.070081 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.070099 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.070112 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.078149 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: E0310 14:03:27.082973 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.088036 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.088221 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.088342 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.088466 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.088625 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.100775 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: E0310 14:03:27.102316 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.106522 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.106558 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.106570 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.106602 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.106617 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.114500 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: E0310 14:03:27.121034 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: E0310 14:03:27.121273 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.123717 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.123776 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.123792 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.123818 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.123836 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.129332 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.193016 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:27 crc kubenswrapper[4911]: E0310 14:03:27.193292 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.227445 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.227510 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.227532 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.227559 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.227580 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.332402 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.332479 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.332500 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.332528 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.332547 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.436588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.436663 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.436686 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.436714 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.436764 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.540809 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.540943 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.540970 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.541005 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.541034 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.645108 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.645170 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.645185 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.645213 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.645229 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.749187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.749267 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.749289 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.749322 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.749347 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.817543 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/1.log" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.818705 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/0.log" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.823288 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7" exitCode=1 Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.823346 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.823398 4911 scope.go:117] "RemoveContainer" containerID="1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.824819 4911 scope.go:117] "RemoveContainer" containerID="6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7" Mar 10 14:03:27 crc kubenswrapper[4911]: E0310 14:03:27.825172 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.848047 4911 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.852552 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.852785 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.853038 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.853202 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.853355 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.876476 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.900686 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.926107 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.944976 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.957169 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.957219 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.957232 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.957255 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.957269 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:27Z","lastTransitionTime":"2026-03-10T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:27 crc kubenswrapper[4911]: I0310 14:03:27.982031 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f399b691063c0e2acfc55407d70259f23a092323bd81abc3a51b1c5365c46db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:25Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 14:03:25.642284 6765 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 14:03:25.642336 6765 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 14:03:25.642342 6765 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0310 14:03:25.642376 6765 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 14:03:25.642451 6765 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 14:03:25.642462 6765 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 14:03:25.642485 6765 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 14:03:25.642539 6765 factory.go:656] Stopping watch factory\\\\nI0310 14:03:25.642544 6765 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 14:03:25.642545 6765 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 14:03:25.642557 6765 ovnkube.go:599] Stopped ovnkube\\\\nI0310 14:03:25.642557 6765 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 14:03:25.642571 6765 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 14:03:25.642569 6765 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 14:03:25.642591 6765 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:2910\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"minNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 14:03:27.001313 6921 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 14:03:27.001330 6921 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 14:03:27.001407 6921 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 
14:03:27.001431 6921 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 14:03:27.000706 6921 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 14:03:27.001512 6921 factory.go:656] Stopping watch factory\\\\nI0310 14:03:27.001558 6921 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 14:03:27.001584 6921 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 14:03:27.001603 6921 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 14:03:27.001860 6921 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 14:03:27.002108 6921 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:27Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.006765 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.042455 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.066456 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.066541 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.066563 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.066627 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.066649 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:28Z","lastTransitionTime":"2026-03-10T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.070663 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.091705 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.109912 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.132299 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.151935 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.164631 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.169517 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.169793 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.169837 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.169877 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.169904 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:28Z","lastTransitionTime":"2026-03-10T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.175686 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc 
kubenswrapper[4911]: I0310 14:03:28.191557 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.193264 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.193388 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:28 crc kubenswrapper[4911]: E0310 14:03:28.193423 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.193516 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:28 crc kubenswrapper[4911]: E0310 14:03:28.193589 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:28 crc kubenswrapper[4911]: E0310 14:03:28.193719 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.210239 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.236083 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.273167 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.273228 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.273256 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.273286 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.273308 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:28Z","lastTransitionTime":"2026-03-10T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.376446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.376501 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.376519 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.376547 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.376567 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:28Z","lastTransitionTime":"2026-03-10T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.480161 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.480229 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.480250 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.480277 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.480296 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:28Z","lastTransitionTime":"2026-03-10T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.583929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.584004 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.584024 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.584059 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.584085 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:28Z","lastTransitionTime":"2026-03-10T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.687355 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.687441 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.687465 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.687499 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.687529 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:28Z","lastTransitionTime":"2026-03-10T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.790977 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.791027 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.791038 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.791056 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.791069 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:28Z","lastTransitionTime":"2026-03-10T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.830569 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/1.log" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.835838 4911 scope.go:117] "RemoveContainer" containerID="6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7" Mar 10 14:03:28 crc kubenswrapper[4911]: E0310 14:03:28.836044 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.862166 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.886022 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.894375 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.894436 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.894452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.894474 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.894490 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:28Z","lastTransitionTime":"2026-03-10T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.903701 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.922108 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.942587 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.958300 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc 
kubenswrapper[4911]: I0310 14:03:28.974292 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.991633 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771ae
e1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:28Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.998087 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.998161 4911 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.998179 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.998209 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:28 crc kubenswrapper[4911]: I0310 14:03:28.998232 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:28Z","lastTransitionTime":"2026-03-10T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.014008 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:29Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.038239 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:29Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.063665 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:29Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.083416 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537
adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:29Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.100194 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:29Z is after 2025-08-24T17:21:41Z"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.102631 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.102718 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.102776 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.102818 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.102846 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:29Z","lastTransitionTime":"2026-03-10T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.135231 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"minNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 14:03:27.001313 6921 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 
14:03:27.001330 6921 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 14:03:27.001407 6921 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 14:03:27.001431 6921 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 14:03:27.000706 6921 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 14:03:27.001512 6921 factory.go:656] Stopping watch factory\\\\nI0310 14:03:27.001558 6921 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 14:03:27.001584 6921 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 14:03:27.001603 6921 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 14:03:27.001860 6921 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 14:03:27.002108 6921 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:29Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.155571 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:29Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.189172 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:29Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.192414 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:29 crc kubenswrapper[4911]: E0310 14:03:29.192792 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.208215 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.208294 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.208332 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.208364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.208388 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:29Z","lastTransitionTime":"2026-03-10T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.214591 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:29Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.235400 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:29Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.312549 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.312624 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.312642 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.312682 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.312703 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:29Z","lastTransitionTime":"2026-03-10T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.415451 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.415522 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.415543 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.415570 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.415593 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:29Z","lastTransitionTime":"2026-03-10T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.519620 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.519709 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.519764 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.519799 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.519823 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:29Z","lastTransitionTime":"2026-03-10T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.623992 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.624075 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.624097 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.624127 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.624148 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:29Z","lastTransitionTime":"2026-03-10T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.727653 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.727758 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.727835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.727864 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.727884 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:29Z","lastTransitionTime":"2026-03-10T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.830913 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.830992 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.831011 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.831043 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.831068 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:29Z","lastTransitionTime":"2026-03-10T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.934596 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.934664 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.934681 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.934714 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:29 crc kubenswrapper[4911]: I0310 14:03:29.934761 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:29Z","lastTransitionTime":"2026-03-10T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.039383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.039455 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.039472 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.039499 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.039518 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:30Z","lastTransitionTime":"2026-03-10T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.142882 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.142994 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.143019 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.143055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.143080 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:30Z","lastTransitionTime":"2026-03-10T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.192844 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.192977 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.192856 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 14:03:30 crc kubenswrapper[4911]: E0310 14:03:30.193134 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 14:03:30 crc kubenswrapper[4911]: E0310 14:03:30.193318 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 14:03:30 crc kubenswrapper[4911]: E0310 14:03:30.193569 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.246697 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.246768 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.246781 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.246804 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.246822 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:30Z","lastTransitionTime":"2026-03-10T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.349272 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.349350 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.349372 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.349403 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.349427 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:30Z","lastTransitionTime":"2026-03-10T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.453453 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.453524 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.453545 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.453576 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.453596 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:30Z","lastTransitionTime":"2026-03-10T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.557087 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.557158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.557173 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.557195 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.557209 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:30Z","lastTransitionTime":"2026-03-10T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.662301 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.662366 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.662381 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.662407 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.662424 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:30Z","lastTransitionTime":"2026-03-10T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.766856 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.766925 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.766941 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.766967 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.766989 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:30Z","lastTransitionTime":"2026-03-10T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.870633 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.870714 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.870763 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.870799 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.870828 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:30Z","lastTransitionTime":"2026-03-10T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.974348 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.974415 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.974434 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.974464 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:30 crc kubenswrapper[4911]: I0310 14:03:30.974560 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:30Z","lastTransitionTime":"2026-03-10T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.077840 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.077904 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.077922 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.077949 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.077968 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:31Z","lastTransitionTime":"2026-03-10T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.181585 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.181633 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.181653 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.181679 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.181698 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:31Z","lastTransitionTime":"2026-03-10T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.192362 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8"
Mar 10 14:03:31 crc kubenswrapper[4911]: E0310 14:03:31.192639 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.285853 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.285941 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.285958 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.285988 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.286006 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:31Z","lastTransitionTime":"2026-03-10T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.389581 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.389649 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.389667 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.389698 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.389718 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:31Z","lastTransitionTime":"2026-03-10T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.493776 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.493839 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.493851 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.493873 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.493885 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:31Z","lastTransitionTime":"2026-03-10T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.596412 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.596443 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.596451 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.596464 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.596473 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:31Z","lastTransitionTime":"2026-03-10T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.699943 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.700000 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.700011 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.700031 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.700049 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:31Z","lastTransitionTime":"2026-03-10T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.802873 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.802934 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.802946 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.802965 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.802979 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:31Z","lastTransitionTime":"2026-03-10T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.906511 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.906642 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.906713 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.906758 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:31 crc kubenswrapper[4911]: I0310 14:03:31.906769 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:31Z","lastTransitionTime":"2026-03-10T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.011630 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.011703 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.011764 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.011794 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.011818 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:32Z","lastTransitionTime":"2026-03-10T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.115681 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.115813 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.115832 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.115863 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.115883 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:32Z","lastTransitionTime":"2026-03-10T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.193171 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.193250 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.193296 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 14:03:32 crc kubenswrapper[4911]: E0310 14:03:32.193476 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 14:03:32 crc kubenswrapper[4911]: E0310 14:03:32.193641 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 14:03:32 crc kubenswrapper[4911]: E0310 14:03:32.193922 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.219599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.219648 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.219665 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.219689 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.219706 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:32Z","lastTransitionTime":"2026-03-10T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.323332 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.323417 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.323438 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.323468 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.323489 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:32Z","lastTransitionTime":"2026-03-10T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.427608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.427678 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.427695 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.427754 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.427773 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:32Z","lastTransitionTime":"2026-03-10T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.532657 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.532804 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.532825 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.532854 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.532877 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:32Z","lastTransitionTime":"2026-03-10T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.636864 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.637415 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.637434 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.637464 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.637484 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:32Z","lastTransitionTime":"2026-03-10T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.741050 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.741099 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.741112 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.741128 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.741143 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:32Z","lastTransitionTime":"2026-03-10T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.845146 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.845225 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.845242 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.845276 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.845303 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:32Z","lastTransitionTime":"2026-03-10T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.950267 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.950354 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.950373 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.950411 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:32 crc kubenswrapper[4911]: I0310 14:03:32.950435 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:32Z","lastTransitionTime":"2026-03-10T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.054014 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.054106 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.054130 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.054165 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.054185 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:33Z","lastTransitionTime":"2026-03-10T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.158248 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.158362 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.158388 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.158417 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.158437 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:33Z","lastTransitionTime":"2026-03-10T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.193015 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:33 crc kubenswrapper[4911]: E0310 14:03:33.193209 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.262138 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.262244 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.262274 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.262317 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.262345 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:33Z","lastTransitionTime":"2026-03-10T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.366022 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.366074 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.366088 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.366112 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.366126 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:33Z","lastTransitionTime":"2026-03-10T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.471135 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.471206 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.471224 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.471250 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.471268 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:33Z","lastTransitionTime":"2026-03-10T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.574992 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.575104 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.575159 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.575187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.575239 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:33Z","lastTransitionTime":"2026-03-10T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.678590 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.678657 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.678669 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.678691 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.678705 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:33Z","lastTransitionTime":"2026-03-10T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.724104 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.782956 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.783058 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.783082 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.783114 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.783138 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:33Z","lastTransitionTime":"2026-03-10T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.886350 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.886414 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.886438 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.886469 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.886492 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:33Z","lastTransitionTime":"2026-03-10T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.989604 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.989708 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.989777 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.989814 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:33 crc kubenswrapper[4911]: I0310 14:03:33.989839 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:33Z","lastTransitionTime":"2026-03-10T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.092859 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.092925 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.092945 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.092971 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.092992 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:34Z","lastTransitionTime":"2026-03-10T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.192391 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.192440 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.192677 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:34 crc kubenswrapper[4911]: E0310 14:03:34.192775 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:34 crc kubenswrapper[4911]: E0310 14:03:34.192975 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:34 crc kubenswrapper[4911]: E0310 14:03:34.193381 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.196833 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.197040 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.197086 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.197119 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.197145 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:34Z","lastTransitionTime":"2026-03-10T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.301762 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.301829 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.301853 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.301890 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.301914 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:34Z","lastTransitionTime":"2026-03-10T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.405863 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.405937 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.405959 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.405987 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.406009 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:34Z","lastTransitionTime":"2026-03-10T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.509999 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.510085 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.510112 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.510152 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.510178 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:34Z","lastTransitionTime":"2026-03-10T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.616073 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.616646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.616938 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.617168 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.617381 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:34Z","lastTransitionTime":"2026-03-10T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.720849 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.720904 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.720922 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.720950 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.720968 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:34Z","lastTransitionTime":"2026-03-10T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.824555 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.824624 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.824646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.824675 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.824697 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:34Z","lastTransitionTime":"2026-03-10T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.928004 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.928068 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.928090 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.928117 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:34 crc kubenswrapper[4911]: I0310 14:03:34.928137 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:34Z","lastTransitionTime":"2026-03-10T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.031343 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.031413 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.031433 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.031466 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.031486 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:35Z","lastTransitionTime":"2026-03-10T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.135938 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.136024 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.136044 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.136073 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.136092 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:35Z","lastTransitionTime":"2026-03-10T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.192662 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:35 crc kubenswrapper[4911]: E0310 14:03:35.193312 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.211083 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.241032 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.241106 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.241128 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.241157 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.241177 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:35Z","lastTransitionTime":"2026-03-10T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.345489 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.345552 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.345569 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.345602 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.345622 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:35Z","lastTransitionTime":"2026-03-10T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.448956 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.449017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.449036 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.449063 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.449081 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:35Z","lastTransitionTime":"2026-03-10T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.552940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.553007 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.553025 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.553051 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.553071 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:35Z","lastTransitionTime":"2026-03-10T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.656619 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.656701 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.656746 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.656776 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.656797 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:35Z","lastTransitionTime":"2026-03-10T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.760095 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.760176 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.760195 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.760221 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.760240 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:35Z","lastTransitionTime":"2026-03-10T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.863414 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.863503 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.863520 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.863543 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.863562 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:35Z","lastTransitionTime":"2026-03-10T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.966351 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.966414 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.966431 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.966459 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:35 crc kubenswrapper[4911]: I0310 14:03:35.966477 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:35Z","lastTransitionTime":"2026-03-10T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.068980 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.069057 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.069078 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.069106 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.069130 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:36Z","lastTransitionTime":"2026-03-10T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:03:36 crc kubenswrapper[4911]: E0310 14:03:36.169917 4911 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.192468 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.192509 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:36 crc kubenswrapper[4911]: E0310 14:03:36.192696 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.192991 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:36 crc kubenswrapper[4911]: E0310 14:03:36.194859 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:36 crc kubenswrapper[4911]: E0310 14:03:36.195096 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.214378 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.239218 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.264228 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.286850 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.312883 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.337417 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.355694 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.388270 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"minNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 14:03:27.001313 6921 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 
14:03:27.001330 6921 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 14:03:27.001407 6921 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 14:03:27.001431 6921 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 14:03:27.000706 6921 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 14:03:27.001512 6921 factory.go:656] Stopping watch factory\\\\nI0310 14:03:27.001558 6921 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 14:03:27.001584 6921 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 14:03:27.001603 6921 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 14:03:27.001860 6921 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 14:03:27.002108 6921 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.409449 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.426086 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.462663 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.484658 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.508960 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mount
Path\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.530870 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc 
kubenswrapper[4911]: E0310 14:03:36.545706 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.561490 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.593599 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.621345 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.645946 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:36 crc kubenswrapper[4911]: I0310 14:03:36.669323 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:36Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.192859 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:37 crc kubenswrapper[4911]: E0310 14:03:37.193120 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.424094 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.424177 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.424202 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.424238 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.424264 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:37Z","lastTransitionTime":"2026-03-10T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:37 crc kubenswrapper[4911]: E0310 14:03:37.446209 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:37Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.452301 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.452356 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.452374 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.452406 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.452426 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:37Z","lastTransitionTime":"2026-03-10T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:37 crc kubenswrapper[4911]: E0310 14:03:37.471617 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:37Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.477435 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.477675 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.477893 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.478093 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.478252 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:37Z","lastTransitionTime":"2026-03-10T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:37 crc kubenswrapper[4911]: E0310 14:03:37.514461 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:37Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.523257 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.523507 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.523703 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.523921 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.524050 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:37Z","lastTransitionTime":"2026-03-10T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:37 crc kubenswrapper[4911]: E0310 14:03:37.545496 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:37Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.561566 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.561632 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.561649 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.561680 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:37 crc kubenswrapper[4911]: I0310 14:03:37.561699 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:37Z","lastTransitionTime":"2026-03-10T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:37 crc kubenswrapper[4911]: E0310 14:03:37.581489 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:37Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:37 crc kubenswrapper[4911]: E0310 14:03:37.581631 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 14:03:38 crc kubenswrapper[4911]: I0310 14:03:38.192596 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:38 crc kubenswrapper[4911]: I0310 14:03:38.192589 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:38 crc kubenswrapper[4911]: E0310 14:03:38.192873 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:38 crc kubenswrapper[4911]: I0310 14:03:38.192910 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:38 crc kubenswrapper[4911]: E0310 14:03:38.193050 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:38 crc kubenswrapper[4911]: E0310 14:03:38.193336 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:39 crc kubenswrapper[4911]: I0310 14:03:39.193347 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:39 crc kubenswrapper[4911]: E0310 14:03:39.193609 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.192667 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.192675 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.192749 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:40 crc kubenswrapper[4911]: E0310 14:03:40.192937 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:40 crc kubenswrapper[4911]: E0310 14:03:40.193130 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:40 crc kubenswrapper[4911]: E0310 14:03:40.193784 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.194230 4911 scope.go:117] "RemoveContainer" containerID="6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.890537 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/1.log" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.895016 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50"} Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.895952 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.916032 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:40Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.930930 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:40Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.957397 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:40Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.972402 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:40Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:40 crc kubenswrapper[4911]: I0310 14:03:40.987543 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mount
Path\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:40Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.002486 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:40Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc 
kubenswrapper[4911]: I0310 14:03:41.018458 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.040160 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.058265 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.075150 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.089477 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.106719 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.123150 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.138854 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.157123 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.172001 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.192173 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.192618 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:41 crc kubenswrapper[4911]: E0310 14:03:41.192817 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.213229 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.239283 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"minNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 14:03:27.001313 6921 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 
14:03:27.001330 6921 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 14:03:27.001407 6921 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 14:03:27.001431 6921 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 14:03:27.000706 6921 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 14:03:27.001512 6921 factory.go:656] Stopping watch factory\\\\nI0310 14:03:27.001558 6921 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 14:03:27.001584 6921 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 14:03:27.001603 6921 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 14:03:27.001860 6921 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 14:03:27.002108 6921 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: E0310 14:03:41.547839 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.902083 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/2.log" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.903065 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/1.log" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.906460 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50" exitCode=1 Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.906572 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50"} Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.906806 4911 scope.go:117] "RemoveContainer" containerID="6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.907798 4911 scope.go:117] "RemoveContainer" containerID="59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50" Mar 10 14:03:41 crc kubenswrapper[4911]: E0310 14:03:41.908141 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.924532 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.945836 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.960952 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc kubenswrapper[4911]: I0310 14:03:41.973761 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:41 crc 
kubenswrapper[4911]: I0310 14:03:41.989100 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.011069 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.035534 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.051027 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.070523 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.090418 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.110537 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.128409 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.142675 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.176508 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f57ddbdf477110264b0b72d90752b43a6bb453fc9b003db2fadf2432c6f7da7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:27Z\\\",\\\"message\\\":\\\"minNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 14:03:27.001313 6921 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 
14:03:27.001330 6921 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 14:03:27.001407 6921 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 14:03:27.001431 6921 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 14:03:27.000706 6921 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 14:03:27.001512 6921 factory.go:656] Stopping watch factory\\\\nI0310 14:03:27.001558 6921 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 14:03:27.001584 6921 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 14:03:27.001603 6921 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 14:03:27.001860 6921 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 14:03:27.002108 6921 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:41Z\\\",\\\"message\\\":\\\"d to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z]\\\\nI0310 14:03:41.200640 7114 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",
\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 
14:03:42.193404 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.193489 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:42 crc kubenswrapper[4911]: E0310 14:03:42.193581 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:42 crc kubenswrapper[4911]: E0310 14:03:42.194007 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.194156 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:42 crc kubenswrapper[4911]: E0310 14:03:42.194288 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.194242 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.228989 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.250479 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.268565 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.286070 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.913513 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/2.log" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.919822 4911 scope.go:117] "RemoveContainer" containerID="59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50" Mar 10 14:03:42 crc kubenswrapper[4911]: E0310 14:03:42.920174 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\"" 
pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.940480 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:42 crc kubenswrapper[4911]: I0310 14:03:42.979483 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.003113 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:42Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.023784 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.043186 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.071962 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e
0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0
fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.101617 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.121456 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.142992 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.161347 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.176443 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc 
kubenswrapper[4911]: I0310 14:03:43.192648 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:43 crc kubenswrapper[4911]: E0310 14:03:43.192928 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.199930 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.216520 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.237993 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.255226 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.272240 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.288968 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537
adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.305183 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.330894 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:41Z\\\",\\\"message\\\":\\\"d to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z]\\\\nI0310 14:03:41.200640 7114 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.731193 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.755074 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.773794 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.807277 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.828395 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.846849 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mount
Path\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.861438 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc 
kubenswrapper[4911]: I0310 14:03:43.877599 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.900863 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.923064 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d2
9d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.946329 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.970947 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:43 crc kubenswrapper[4911]: I0310 14:03:43.988341 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:43Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:44 crc kubenswrapper[4911]: I0310 14:03:44.008392 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:44Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:44 crc kubenswrapper[4911]: I0310 14:03:44.027503 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:44Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:44 crc kubenswrapper[4911]: I0310 14:03:44.046251 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:44Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:44 crc kubenswrapper[4911]: I0310 14:03:44.065876 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:44Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:44 crc kubenswrapper[4911]: I0310 14:03:44.084826 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:44Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:44 crc kubenswrapper[4911]: I0310 14:03:44.102364 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:44Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:44 crc kubenswrapper[4911]: I0310 14:03:44.145253 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:41Z\\\",\\\"message\\\":\\\"d to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z]\\\\nI0310 14:03:41.200640 7114 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:44Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:44 crc kubenswrapper[4911]: I0310 14:03:44.192660 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:44 crc kubenswrapper[4911]: I0310 14:03:44.192659 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:44 crc kubenswrapper[4911]: I0310 14:03:44.192819 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:44 crc kubenswrapper[4911]: E0310 14:03:44.193102 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:44 crc kubenswrapper[4911]: E0310 14:03:44.193211 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:44 crc kubenswrapper[4911]: E0310 14:03:44.193608 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:45 crc kubenswrapper[4911]: I0310 14:03:45.192341 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:45 crc kubenswrapper[4911]: E0310 14:03:45.192605 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.193053 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.193134 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.193053 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:46 crc kubenswrapper[4911]: E0310 14:03:46.193306 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:46 crc kubenswrapper[4911]: E0310 14:03:46.193487 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:46 crc kubenswrapper[4911]: E0310 14:03:46.193590 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.213501 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\
\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.238297 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.260007 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.283630 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.302369 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc 
kubenswrapper[4911]: I0310 14:03:46.318262 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.349702 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.368867 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d2
9d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.386119 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.407383 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.429611 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.452360 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.472445 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ 
'[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.495855 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.521105 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.544941 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: E0310 14:03:46.549239 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.565266 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.583797 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:46 crc kubenswrapper[4911]: I0310 14:03:46.624960 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:41Z\\\",\\\"message\\\":\\\"d to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z]\\\\nI0310 14:03:41.200640 7114 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:46Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.193099 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:47 crc kubenswrapper[4911]: E0310 14:03:47.193355 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.617125 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.617258 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.617281 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.617313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.617336 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:47Z","lastTransitionTime":"2026-03-10T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:47 crc kubenswrapper[4911]: E0310 14:03:47.642469 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:47Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.650511 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.650571 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.650590 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.650622 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.650641 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:47Z","lastTransitionTime":"2026-03-10T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:47 crc kubenswrapper[4911]: E0310 14:03:47.673819 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:47Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.681807 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.681884 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.681906 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.681938 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.681966 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:47Z","lastTransitionTime":"2026-03-10T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:47 crc kubenswrapper[4911]: E0310 14:03:47.708824 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:47Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.715534 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.715593 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.715607 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.715629 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.715644 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:47Z","lastTransitionTime":"2026-03-10T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:47 crc kubenswrapper[4911]: E0310 14:03:47.737261 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:47Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.742163 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.742231 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.742257 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.742288 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:47 crc kubenswrapper[4911]: I0310 14:03:47.742312 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:47Z","lastTransitionTime":"2026-03-10T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:47 crc kubenswrapper[4911]: E0310 14:03:47.768332 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:47Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:47 crc kubenswrapper[4911]: E0310 14:03:47.768574 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 14:03:48 crc kubenswrapper[4911]: I0310 14:03:48.192836 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:48 crc kubenswrapper[4911]: I0310 14:03:48.192900 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:48 crc kubenswrapper[4911]: I0310 14:03:48.192836 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:48 crc kubenswrapper[4911]: E0310 14:03:48.193099 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:48 crc kubenswrapper[4911]: E0310 14:03:48.193221 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:48 crc kubenswrapper[4911]: E0310 14:03:48.193530 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:49 crc kubenswrapper[4911]: I0310 14:03:49.193087 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:49 crc kubenswrapper[4911]: E0310 14:03:49.193845 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:50 crc kubenswrapper[4911]: I0310 14:03:50.194124 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:50 crc kubenswrapper[4911]: I0310 14:03:50.194171 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:50 crc kubenswrapper[4911]: I0310 14:03:50.194177 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:50 crc kubenswrapper[4911]: E0310 14:03:50.194979 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:50 crc kubenswrapper[4911]: E0310 14:03:50.195000 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:50 crc kubenswrapper[4911]: E0310 14:03:50.194987 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:50 crc kubenswrapper[4911]: I0310 14:03:50.956962 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nsxjn_fc662696-d402-4969-bebd-00fa42e63075/kube-multus/0.log" Mar 10 14:03:50 crc kubenswrapper[4911]: I0310 14:03:50.957069 4911 generic.go:334] "Generic (PLEG): container finished" podID="fc662696-d402-4969-bebd-00fa42e63075" containerID="6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54" exitCode=1 Mar 10 14:03:50 crc kubenswrapper[4911]: I0310 14:03:50.957123 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nsxjn" event={"ID":"fc662696-d402-4969-bebd-00fa42e63075","Type":"ContainerDied","Data":"6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54"} Mar 10 14:03:50 crc kubenswrapper[4911]: I0310 14:03:50.957905 4911 scope.go:117] "RemoveContainer" containerID="6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54" Mar 10 14:03:50 crc kubenswrapper[4911]: I0310 14:03:50.980423 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337
b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:50Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.000412 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:50Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.030681 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:41Z\\\",\\\"message\\\":\\\"d to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z]\\\\nI0310 14:03:41.200640 7114 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.047231 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.077259 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.105810 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.131932 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.152586 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.180300 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.192301 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:51 crc kubenswrapper[4911]: E0310 14:03:51.192563 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.199548 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.220701 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.242329 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.258236 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc 
kubenswrapper[4911]: I0310 14:03:51.276078 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.298331 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da
410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.319410 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.339878 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.357936 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.377075 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"2026-03-10T14:03:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee\\\\n2026-03-10T14:03:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee to /host/opt/cni/bin/\\\\n2026-03-10T14:03:05Z [verbose] multus-daemon started\\\\n2026-03-10T14:03:05Z [verbose] Readiness Indicator file check\\\\n2026-03-10T14:03:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:51 crc kubenswrapper[4911]: E0310 14:03:51.550759 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.964651 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nsxjn_fc662696-d402-4969-bebd-00fa42e63075/kube-multus/0.log" Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.965216 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nsxjn" event={"ID":"fc662696-d402-4969-bebd-00fa42e63075","Type":"ContainerStarted","Data":"434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09"} Mar 10 14:03:51 crc kubenswrapper[4911]: I0310 14:03:51.990380 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:51Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.011836 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.030781 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc 
kubenswrapper[4911]: I0310 14:03:52.047277 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.071097 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.071474 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.071849 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.071572 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.071974 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.071927 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.072059 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:04:56.072034797 +0000 UTC m=+200.635554744 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.072099 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.072280 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.072302 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.072336 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.072400 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 14:04:56.072363405 +0000 UTC m=+200.635883352 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.072416 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.072457 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.072483 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.072522 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 14:04:56.072479988 +0000 UTC m=+200.635999935 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.072568 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 14:04:56.07254363 +0000 UTC m=+200.636063667 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.110267 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d2
9d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.135232 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.159630 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.176382 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.192644 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.192677 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.192753 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.192883 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.193000 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.193100 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.198626 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"2026-03-10T14:03:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee\\\\n2026-03-10T14:03:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee to /host/opt/cni/bin/\\\\n2026-03-10T14:03:05Z [verbose] multus-daemon started\\\\n2026-03-10T14:03:05Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T14:03:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.222973 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.245600 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.274368 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.274577 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:04:56.274547164 +0000 UTC m=+200.838067111 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.274708 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.274872 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:03:52 crc kubenswrapper[4911]: E0310 14:03:52.274948 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs podName:d7a44efc-20ad-4c01-9606-e6fdb5e0c721 nodeName:}" failed. 
No retries permitted until 2026-03-10 14:04:56.274930134 +0000 UTC m=+200.838450071 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs") pod "network-metrics-daemon-r28f8" (UID: "d7a44efc-20ad-4c01-9606-e6fdb5e0c721") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.279642 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:41Z\\\",\\\"message\\\":\\\"d to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z]\\\\nI0310 14:03:41.200640 7114 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.303235 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337
b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.314904 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.329536 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.345608 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.361235 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:52 crc kubenswrapper[4911]: I0310 14:03:52.388492 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:52Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:53 crc kubenswrapper[4911]: I0310 14:03:53.193196 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:53 crc kubenswrapper[4911]: E0310 14:03:53.193458 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:54 crc kubenswrapper[4911]: I0310 14:03:54.193480 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:54 crc kubenswrapper[4911]: I0310 14:03:54.193536 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:54 crc kubenswrapper[4911]: E0310 14:03:54.193791 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:54 crc kubenswrapper[4911]: I0310 14:03:54.193886 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:54 crc kubenswrapper[4911]: E0310 14:03:54.194109 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:54 crc kubenswrapper[4911]: E0310 14:03:54.194180 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:55 crc kubenswrapper[4911]: I0310 14:03:55.193202 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:55 crc kubenswrapper[4911]: E0310 14:03:55.193465 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.193324 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.193456 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:56 crc kubenswrapper[4911]: E0310 14:03:56.193697 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:56 crc kubenswrapper[4911]: E0310 14:03:56.193911 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.194552 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:56 crc kubenswrapper[4911]: E0310 14:03:56.194834 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.211826 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.255493 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.280508 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.301348 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.318909 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.368478 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e
0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0
fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.390845 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d2
9d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.410465 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.427335 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.442580 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.453949 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc 
kubenswrapper[4911]: I0310 14:03:56.469485 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.484633 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.503772 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.523034 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.543257 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"2026-03-10T14:03:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee\\\\n2026-03-10T14:03:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee to /host/opt/cni/bin/\\\\n2026-03-10T14:03:05Z [verbose] multus-daemon started\\\\n2026-03-10T14:03:05Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T14:03:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: E0310 14:03:56.551660 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.563540 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.581887 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:56 crc kubenswrapper[4911]: I0310 14:03:56.605861 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:41Z\\\",\\\"message\\\":\\\"d to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z]\\\\nI0310 14:03:41.200640 7114 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:56Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.193218 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:57 crc kubenswrapper[4911]: E0310 14:03:57.193393 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.194760 4911 scope.go:117] "RemoveContainer" containerID="59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50" Mar 10 14:03:57 crc kubenswrapper[4911]: E0310 14:03:57.195054 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.868433 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.868517 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.868542 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.868572 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.868594 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:57Z","lastTransitionTime":"2026-03-10T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:57 crc kubenswrapper[4911]: E0310 14:03:57.891443 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:57Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.897080 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.897129 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.897138 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.897155 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.897168 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:57Z","lastTransitionTime":"2026-03-10T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:57 crc kubenswrapper[4911]: E0310 14:03:57.916387 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:57Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.921939 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.921984 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.922022 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.922046 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.922061 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:57Z","lastTransitionTime":"2026-03-10T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:57 crc kubenswrapper[4911]: E0310 14:03:57.942654 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:57Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.949581 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.949826 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.949887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.949915 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.949937 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:57Z","lastTransitionTime":"2026-03-10T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:57 crc kubenswrapper[4911]: E0310 14:03:57.973529 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:57Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.981453 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.981822 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.982015 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.982234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:03:57 crc kubenswrapper[4911]: I0310 14:03:57.982443 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:03:57Z","lastTransitionTime":"2026-03-10T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:03:57 crc kubenswrapper[4911]: E0310 14:03:57.997379 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:57Z is after 2025-08-24T17:21:41Z" Mar 10 14:03:57 crc kubenswrapper[4911]: E0310 14:03:57.997559 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 14:03:58 crc kubenswrapper[4911]: I0310 14:03:58.192830 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:03:58 crc kubenswrapper[4911]: I0310 14:03:58.192832 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:03:58 crc kubenswrapper[4911]: E0310 14:03:58.193045 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:03:58 crc kubenswrapper[4911]: E0310 14:03:58.193106 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:03:58 crc kubenswrapper[4911]: I0310 14:03:58.192864 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:03:58 crc kubenswrapper[4911]: E0310 14:03:58.193685 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:03:59 crc kubenswrapper[4911]: I0310 14:03:59.193251 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:03:59 crc kubenswrapper[4911]: E0310 14:03:59.193491 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:00 crc kubenswrapper[4911]: I0310 14:04:00.192366 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:00 crc kubenswrapper[4911]: I0310 14:04:00.192381 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:00 crc kubenswrapper[4911]: E0310 14:04:00.192597 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:00 crc kubenswrapper[4911]: I0310 14:04:00.192684 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:00 crc kubenswrapper[4911]: E0310 14:04:00.192925 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:00 crc kubenswrapper[4911]: E0310 14:04:00.193052 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:01 crc kubenswrapper[4911]: I0310 14:04:01.193353 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:01 crc kubenswrapper[4911]: E0310 14:04:01.193684 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:01 crc kubenswrapper[4911]: E0310 14:04:01.553989 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:04:02 crc kubenswrapper[4911]: I0310 14:04:02.193252 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:02 crc kubenswrapper[4911]: I0310 14:04:02.193399 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:02 crc kubenswrapper[4911]: E0310 14:04:02.193453 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:02 crc kubenswrapper[4911]: I0310 14:04:02.193490 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:02 crc kubenswrapper[4911]: E0310 14:04:02.193603 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:02 crc kubenswrapper[4911]: E0310 14:04:02.193785 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:03 crc kubenswrapper[4911]: I0310 14:04:03.193051 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:03 crc kubenswrapper[4911]: E0310 14:04:03.193288 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:04 crc kubenswrapper[4911]: I0310 14:04:04.192851 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:04 crc kubenswrapper[4911]: I0310 14:04:04.192964 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:04 crc kubenswrapper[4911]: E0310 14:04:04.193077 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:04 crc kubenswrapper[4911]: E0310 14:04:04.193186 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:04 crc kubenswrapper[4911]: I0310 14:04:04.193319 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:04 crc kubenswrapper[4911]: E0310 14:04:04.193500 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:05 crc kubenswrapper[4911]: I0310 14:04:05.192522 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:05 crc kubenswrapper[4911]: E0310 14:04:05.192793 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.193304 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.193406 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:06 crc kubenswrapper[4911]: E0310 14:04:06.193469 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:06 crc kubenswrapper[4911]: E0310 14:04:06.193608 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.193828 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:06 crc kubenswrapper[4911]: E0310 14:04:06.193947 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.212337 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.225210 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.235476 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03
:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.253820 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.277486 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.294131 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.312441 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.327755 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.341846 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"2026-03-10T14:03:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee\\\\n2026-03-10T14:03:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee to /host/opt/cni/bin/\\\\n2026-03-10T14:03:05Z [verbose] multus-daemon started\\\\n2026-03-10T14:03:05Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T14:03:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.360501 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.374443 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.394090 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.414935 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.436368 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.460424 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:41Z\\\",\\\"message\\\":\\\"d to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z]\\\\nI0310 14:03:41.200640 7114 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.478699 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.492577 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z"
Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.519179 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: I0310 14:04:06.538886 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:06Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:06 crc kubenswrapper[4911]: E0310 14:04:06.555605 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:04:07 crc kubenswrapper[4911]: I0310 14:04:07.193189 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:07 crc kubenswrapper[4911]: E0310 14:04:07.193582 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.192548 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.192641 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.192664 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:08 crc kubenswrapper[4911]: E0310 14:04:08.192849 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:08 crc kubenswrapper[4911]: E0310 14:04:08.193090 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:08 crc kubenswrapper[4911]: E0310 14:04:08.193257 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.299930 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.300037 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.300055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.300075 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.300089 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:08Z","lastTransitionTime":"2026-03-10T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 14:04:08 crc kubenswrapper[4911]: E0310 14:04:08.321763 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:08Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.326614 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.326655 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.326667 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.326686 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.326702 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:08Z","lastTransitionTime":"2026-03-10T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 14:04:08 crc kubenswrapper[4911]: E0310 14:04:08.346448 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:08Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.351618 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.351680 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.351695 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.351716 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.351770 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:08Z","lastTransitionTime":"2026-03-10T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:04:08 crc kubenswrapper[4911]: E0310 14:04:08.371210 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:08Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.376797 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.376894 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.376921 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.376956 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.376976 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:08Z","lastTransitionTime":"2026-03-10T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:04:08 crc kubenswrapper[4911]: E0310 14:04:08.398765 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:08Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.404187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.404254 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.404275 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.404300 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:08 crc kubenswrapper[4911]: I0310 14:04:08.404319 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:08Z","lastTransitionTime":"2026-03-10T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:04:08 crc kubenswrapper[4911]: E0310 14:04:08.424138 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:08Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:08 crc kubenswrapper[4911]: E0310 14:04:08.424379 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 14:04:09 crc kubenswrapper[4911]: I0310 14:04:09.192588 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:09 crc kubenswrapper[4911]: E0310 14:04:09.192858 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:09 crc kubenswrapper[4911]: I0310 14:04:09.194199 4911 scope.go:117] "RemoveContainer" containerID="59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.046116 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/2.log" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.050038 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a"} Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.050684 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.062184 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7
c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.077163 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.090559 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.109584 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.128499 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.143797 4911 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.158763 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc 
kubenswrapper[4911]: I0310 14:04:10.172927 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.192184 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.192824 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.192822 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:10 crc kubenswrapper[4911]: E0310 14:04:10.192994 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:10 crc kubenswrapper[4911]: E0310 14:04:10.193112 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.192822 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:10 crc kubenswrapper[4911]: E0310 14:04:10.193238 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.211645 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.227863 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.241447 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.252032 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.275699 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"2026-03-10T14:03:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee\\\\n2026-03-10T14:03:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee to /host/opt/cni/bin/\\\\n2026-03-10T14:03:05Z [verbose] multus-daemon started\\\\n2026-03-10T14:03:05Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T14:03:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.291462 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.306867 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.336211 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:41Z\\\",\\\"message\\\":\\\"d to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z]\\\\nI0310 14:03:41.200640 7114 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.350673 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337
b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:10 crc kubenswrapper[4911]: I0310 14:04:10.363663 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.057583 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/3.log" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.059261 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/2.log" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.064081 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a" exitCode=1 Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.064159 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a"} Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.064220 4911 scope.go:117] "RemoveContainer" containerID="59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.065643 4911 scope.go:117] "RemoveContainer" containerID="79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a" Mar 10 14:04:11 crc kubenswrapper[4911]: E0310 14:04:11.066080 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.088120 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03
-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.107288 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.129471 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.146137 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.162978 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"2026-03-10T14:03:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee\\\\n2026-03-10T14:03:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee to /host/opt/cni/bin/\\\\n2026-03-10T14:03:05Z [verbose] multus-daemon started\\\\n2026-03-10T14:03:05Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T14:03:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.184177 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337
b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.193185 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:11 crc kubenswrapper[4911]: E0310 14:04:11.193387 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.198606 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.223604 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f3eb98143cf905710d649fb530582c6b878a5a3c51ca46ca86f97d5d8d6a50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:41Z\\\",\\\"message\\\":\\\"d to run ovnkube: [failed to start network controller: failed to start default network 
controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:03:41Z is after 2025-08-24T17:21:41Z]\\\\nI0310 14:03:41.200640 7114 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:04:10Z\\\",\\\"message\\\":\\\" 1 for removal\\\\nI0310 14:04:10.207822 7428 factory.go:656] Stopping watch factory\\\\nI0310 14:04:10.207835 7428 ovnkube.go:599] Stopped 
ovnkube\\\\nI0310 14:04:10.207812 7428 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 14:04:10.207867 7428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 14:04:10.207864 7428 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 14:04:10.207869 7428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 14:04:10.207914 7428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nF0310 14:04:10.207937 7428 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 
0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.240186 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.265280 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.280349 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.295617 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.314404 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.334147 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.352004 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.370227 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.388434 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: I0310 14:04:11.404333 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc 
kubenswrapper[4911]: I0310 14:04:11.417223 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:11 crc kubenswrapper[4911]: E0310 14:04:11.557758 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.074096 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/3.log" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.079344 4911 scope.go:117] "RemoveContainer" containerID="79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a" Mar 10 14:04:12 crc kubenswrapper[4911]: E0310 14:04:12.079617 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.097666 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337
b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.116015 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.146425 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:04:10Z\\\",\\\"message\\\":\\\" 1 for removal\\\\nI0310 14:04:10.207822 7428 factory.go:656] Stopping watch factory\\\\nI0310 14:04:10.207835 7428 ovnkube.go:599] Stopped ovnkube\\\\nI0310 14:04:10.207812 7428 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 14:04:10.207867 7428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 14:04:10.207864 7428 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 14:04:10.207869 7428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 14:04:10.207914 7428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nF0310 14:04:10.207937 7428 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:04:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.165793 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.184986 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.192462 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.192490 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.192552 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:12 crc kubenswrapper[4911]: E0310 14:04:12.192635 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:12 crc kubenswrapper[4911]: E0310 14:04:12.192958 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:12 crc kubenswrapper[4911]: E0310 14:04:12.193032 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.222844 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.241425 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7
c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.259431 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.275572 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc 
kubenswrapper[4911]: I0310 14:04:12.298698 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.324682 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.351145 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d2
9d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.366874 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.384307 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.399502 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.419932 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"2026-03-10T14:03:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee\\\\n2026-03-10T14:03:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee to /host/opt/cni/bin/\\\\n2026-03-10T14:03:05Z [verbose] multus-daemon started\\\\n2026-03-10T14:03:05Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T14:03:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.439543 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.457517 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:12 crc kubenswrapper[4911]: I0310 14:04:12.474421 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:12Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:13 crc kubenswrapper[4911]: I0310 14:04:13.192900 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:13 crc kubenswrapper[4911]: E0310 14:04:13.193114 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:14 crc kubenswrapper[4911]: I0310 14:04:14.192881 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:14 crc kubenswrapper[4911]: I0310 14:04:14.193088 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:14 crc kubenswrapper[4911]: I0310 14:04:14.194026 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:14 crc kubenswrapper[4911]: E0310 14:04:14.194260 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:14 crc kubenswrapper[4911]: E0310 14:04:14.194438 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:14 crc kubenswrapper[4911]: E0310 14:04:14.194564 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:15 crc kubenswrapper[4911]: I0310 14:04:15.192562 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:15 crc kubenswrapper[4911]: E0310 14:04:15.192825 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.193243 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.193398 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:16 crc kubenswrapper[4911]: E0310 14:04:16.193473 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:16 crc kubenswrapper[4911]: E0310 14:04:16.193646 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.193866 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:16 crc kubenswrapper[4911]: E0310 14:04:16.194177 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.217593 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.237421 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.273090 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:04:10Z\\\",\\\"message\\\":\\\" 1 for removal\\\\nI0310 14:04:10.207822 7428 factory.go:656] Stopping watch factory\\\\nI0310 14:04:10.207835 7428 ovnkube.go:599] Stopped ovnkube\\\\nI0310 14:04:10.207812 7428 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 14:04:10.207867 7428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 14:04:10.207864 7428 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 14:04:10.207869 7428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 14:04:10.207914 7428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nF0310 14:04:10.207937 7428 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:04:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z"
Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.287316 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z"
Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.316924 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.338788 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.357956 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.381642 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d2
9d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.401083 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.420774 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.440718 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.456281 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc 
kubenswrapper[4911]: I0310 14:04:16.474565 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.496647 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.516458 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.538173 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: E0310 14:04:16.558802 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.560858 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.582286 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:16 crc kubenswrapper[4911]: I0310 14:04:16.602391 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"2026-03-10T14:03:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee\\\\n2026-03-10T14:03:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee to /host/opt/cni/bin/\\\\n2026-03-10T14:03:05Z [verbose] multus-daemon started\\\\n2026-03-10T14:03:05Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T14:03:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:17 crc kubenswrapper[4911]: I0310 14:04:17.192665 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:17 crc kubenswrapper[4911]: E0310 14:04:17.192973 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.192831 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.192839 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.192855 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:18 crc kubenswrapper[4911]: E0310 14:04:18.193069 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:18 crc kubenswrapper[4911]: E0310 14:04:18.193270 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:18 crc kubenswrapper[4911]: E0310 14:04:18.193428 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.578435 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.578509 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.578531 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.578559 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.578579 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:18Z","lastTransitionTime":"2026-03-10T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:04:18 crc kubenswrapper[4911]: E0310 14:04:18.593981 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.599057 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.599137 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.599162 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.599195 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.599219 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:18Z","lastTransitionTime":"2026-03-10T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:04:18 crc kubenswrapper[4911]: E0310 14:04:18.618217 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.623201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.623266 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.623284 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.623312 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.623334 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:18Z","lastTransitionTime":"2026-03-10T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:04:18 crc kubenswrapper[4911]: E0310 14:04:18.639802 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.649759 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.650089 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.650169 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.650262 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.650346 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:18Z","lastTransitionTime":"2026-03-10T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:04:18 crc kubenswrapper[4911]: E0310 14:04:18.670230 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.675158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.675234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.675256 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.675280 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:18 crc kubenswrapper[4911]: I0310 14:04:18.675298 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:18Z","lastTransitionTime":"2026-03-10T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 14:04:18 crc kubenswrapper[4911]: E0310 14:04:18.694608 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T14:04:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"36e9ec0c-5432-482b-b6c8-9d6220be2548\\\",\\\"systemUUID\\\":\\\"74bee069-21da-4cc8-a69e-a4f54ba3e964\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:18Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:18 crc kubenswrapper[4911]: E0310 14:04:18.695194 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 14:04:19 crc kubenswrapper[4911]: I0310 14:04:19.192648 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:19 crc kubenswrapper[4911]: E0310 14:04:19.192951 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:20 crc kubenswrapper[4911]: I0310 14:04:20.193103 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:20 crc kubenswrapper[4911]: I0310 14:04:20.193144 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:20 crc kubenswrapper[4911]: I0310 14:04:20.193145 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:20 crc kubenswrapper[4911]: E0310 14:04:20.193271 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:20 crc kubenswrapper[4911]: E0310 14:04:20.193444 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:20 crc kubenswrapper[4911]: E0310 14:04:20.193500 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:21 crc kubenswrapper[4911]: I0310 14:04:21.192679 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:21 crc kubenswrapper[4911]: E0310 14:04:21.193172 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:21 crc kubenswrapper[4911]: E0310 14:04:21.560259 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:04:22 crc kubenswrapper[4911]: I0310 14:04:22.193428 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:22 crc kubenswrapper[4911]: I0310 14:04:22.193516 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:22 crc kubenswrapper[4911]: I0310 14:04:22.193577 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:22 crc kubenswrapper[4911]: E0310 14:04:22.193670 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:22 crc kubenswrapper[4911]: E0310 14:04:22.193835 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:22 crc kubenswrapper[4911]: E0310 14:04:22.193999 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:23 crc kubenswrapper[4911]: I0310 14:04:23.192355 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:23 crc kubenswrapper[4911]: E0310 14:04:23.192703 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:24 crc kubenswrapper[4911]: I0310 14:04:24.192945 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:24 crc kubenswrapper[4911]: I0310 14:04:24.192962 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:24 crc kubenswrapper[4911]: E0310 14:04:24.193156 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:24 crc kubenswrapper[4911]: I0310 14:04:24.192990 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:24 crc kubenswrapper[4911]: E0310 14:04:24.194456 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:24 crc kubenswrapper[4911]: E0310 14:04:24.194627 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:25 crc kubenswrapper[4911]: I0310 14:04:25.192904 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:25 crc kubenswrapper[4911]: E0310 14:04:25.193088 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.192689 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.192689 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.192921 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:26 crc kubenswrapper[4911]: E0310 14:04:26.193097 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:26 crc kubenswrapper[4911]: E0310 14:04:26.193244 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:26 crc kubenswrapper[4911]: E0310 14:04:26.193418 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.207838 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2970cff-e2bc-40e6-9d80-7388d88e840e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5de189dc13327a9498c27b448c0d10c7392df027d571d570defd2ed9ea81f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337
b220689c09aeddc696338950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tg8sx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.224602 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9p6ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0fdaa42-a77f-4f62-b94f-6659225e12af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b63ac74908665e41822bf1d238dde7de4f431b9a0f0c5a5cc381953b9b7b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87sf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9p6ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.246913 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed2b430b-2281-4231-9135-f0289be08cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:04:10Z\\\",\\\"message\\\":\\\" 1 for removal\\\\nI0310 14:04:10.207822 7428 factory.go:656] Stopping watch factory\\\\nI0310 14:04:10.207835 7428 ovnkube.go:599] Stopped ovnkube\\\\nI0310 14:04:10.207812 7428 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 14:04:10.207867 7428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 14:04:10.207864 7428 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 14:04:10.207869 7428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 14:04:10.207914 7428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nF0310 14:04:10.207937 7428 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:04:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a75f4b3d212c7d6a19
30f3aff879700793002d73a65686c54a0dbd7da45efc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh7d8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4256n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.263119 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66be8d6e-80ee-40ee-ba3c-86ab422d6d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5905967890ef6c75106a108df85cd095d45042551df82ec8d0d7afe6b534bbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32436fc8e5804c42bbc700adffa7aa5ac5b44ec9f3f12e7586dc8a1ecc6398d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.288878 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"146331c0-4032-4f38-ab26-c56eb746dcba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://032cba5dc96c54f3169bee7bce6b3d4d37535e6919dba6e3200056c9897e63fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ebedcb9b5fc3e7abfb705845fb86fbee914d4daa3e73b91a47fb9bf6d68e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c39dee03fee6577ade9558e41a8a6c625687ffdf9b80a14696dbc55815c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2013cc6aeb85776acf6593d7eb7251a58ba1248d9da4cf9da6464dcb0ef71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b91bb44ba75aa77d3224110463b1b67c03661419427b6df80b6249195ec3f6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698a09502a71d47663c7da83f4984e7c287b4cfa575f89fe3f53711967364ad0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://692780c27061dc3947a92a2421f342a3b245bdda8e01f047141335da4329f780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f8fc980fde2bb2c1570f69b5204ef27f5e6c89aed036240669b975909185841\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.306310 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cae19a2-5349-4f13-94ab-bfe066e4589a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8431f3d3392feda333225649d1a8236d4747e038249bccbf47729b10fce77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e615fb1457509d1cf8643f83f389c229458d7c6e4769fce1153990a58dd33e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n2ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xxbj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.323839 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.342825 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z255c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"349ee2ee-803a-404b-9aa2-2230eabdbb56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d91578b429386e0b43d650071ef2f3015fde0037adeea9ca58712c928202437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a09eeb4d90deb6d1d5c8cc2653bc06f1c4bb612521c26d527915362f36c14fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88d67b30966e0de04d31d12677cfee2ba4fbafcdc7156291ecba307b048dbe5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7eb0b3128e0fb1fd1669699963575def23ee9432e1943b82d857453b43e9603\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edb6
af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edb6af7141bc1c6f92d323c5ab3c4a0d5038bf7e9287c18c675a6b2393cc467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497b88b1b3ea24fae27354d68da694acbfa9789bc356fa0c7aefe146cf9bbbc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe25fe11fbf557bdd95903cf38946374abf7cb38f2024deea719b40288857\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z255c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.361616 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c243d166-4ad1-46ec-ac74-f5f55b7e0fb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 14:02:42.631426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 14:02:42.631674 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 14:02:42.633089 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2667175776/tls.crt::/tmp/serving-cert-2667175776/tls.key\\\\\\\"\\\\nI0310 14:02:43.069632 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 14:02:43.077263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 14:02:43.077307 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 14:02:43.077339 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 14:02:43.077344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 14:02:43.081490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 14:02:43.081568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081601 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 14:02:43.081623 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 14:02:43.081643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 14:02:43.081662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0310 14:02:43.081498 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 14:02:43.081682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 14:02:43.085341 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:02:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.379298 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.394576 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cfb28c19c39b1dd76b697713e391914239b9c7a5b45b35b6d7f827cb8f3b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04eda9a8d1959b5189db0f28943b9cbb1124bdd6c58056deb534de1b44c784ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.412183 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26e6576a420e25dc644b2ff91ae514ba08624501377682f7f8e5c19ed45313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.429180 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-r28f8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zp4ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-r28f8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc 
kubenswrapper[4911]: I0310 14:04:26.444670 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vfj7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acf8e218-4a2a-4d62-9aa8-7ecca1109d35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b0571bd5088fa0a65e268be688b27df598a8eb55558226082588a488736a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dplb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vfj7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.460061 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31019d25-2541-4af4-9f53-c194a80f418b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5024c6215a8deeba42e2e20668fc5c95dcd413cbf3db57808c33a368c8e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da
410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154dac2faa22aacb028da5fe052db316b084248e0bac801b60d019c2a01046b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T14:02:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 14:01:38.525889 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 14:01:38.527757 1 observer_polling.go:159] Starting file observer\\\\nI0310 14:01:38.561760 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 14:01:38.565033 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 14:02:03.772765 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 14:02:03.772924 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df6f5bfa79eee91b702510101f1c1481745beb1c1789f4d5c9a463f674d6ee5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b7da6e041bb7591d3226943f03b8a47b8dc3dee9523839511958ddd3bf0d229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.475504 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3364a880-4a53-4ed2-bf42-c4e7274bc191\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5edd84b31d5917a5d402e8c5157d4b13ebbf65e276612eaaf0dde5e6283c7842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce717e6fd56762675145e49984ed7764f4ceef12e27e62a2b9d8ca5dbece70f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7086b076e4cdd45b8322e6d67939e711b5e6b553869cd6aeedda10be7bfe69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7210009b5e438df4c7297fd63914732aff050d0595161ebe7a1cf938942d532f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T14:01:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:01:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.493459 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41cea013e30ec59eb94c0404489c5349e85c85aad388276aead73f8b724752be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.516114 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: I0310 14:04:26.534974 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nsxjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc662696-d402-4969-bebd-00fa42e63075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T14:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T14:03:50Z\\\",\\\"message\\\":\\\"2026-03-10T14:03:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee\\\\n2026-03-10T14:03:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fcfe1536-b385-43a7-98e6-d5ef692825ee to /host/opt/cni/bin/\\\\n2026-03-10T14:03:05Z [verbose] multus-daemon started\\\\n2026-03-10T14:03:05Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T14:03:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T14:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T14:03:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T14:02:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nsxjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 10 14:04:26 crc kubenswrapper[4911]: E0310 14:04:26.561490 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:04:27 crc kubenswrapper[4911]: I0310 14:04:27.192975 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:27 crc kubenswrapper[4911]: I0310 14:04:27.194481 4911 scope.go:117] "RemoveContainer" containerID="79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a" Mar 10 14:04:27 crc kubenswrapper[4911]: E0310 14:04:27.195051 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:04:27 crc kubenswrapper[4911]: E0310 14:04:27.195644 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:28 crc kubenswrapper[4911]: I0310 14:04:28.192602 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:28 crc kubenswrapper[4911]: I0310 14:04:28.192650 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:28 crc kubenswrapper[4911]: I0310 14:04:28.192619 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:28 crc kubenswrapper[4911]: E0310 14:04:28.192798 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:28 crc kubenswrapper[4911]: E0310 14:04:28.193132 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:28 crc kubenswrapper[4911]: E0310 14:04:28.193254 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.011799 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.011851 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.011862 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.011881 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.011896 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T14:04:29Z","lastTransitionTime":"2026-03-10T14:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.071327 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2"] Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.071773 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.073550 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.074227 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.074850 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.075713 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.105542 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9p6ll" podStartSLOduration=139.10551942 podStartE2EDuration="2m19.10551942s" podCreationTimestamp="2026-03-10 14:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.105401537 +0000 UTC m=+173.668921504" watchObservedRunningTime="2026-03-10 14:04:29.10551942 +0000 UTC m=+173.669039337" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.105833 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podStartSLOduration=138.105826178 podStartE2EDuration="2m18.105826178s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.092524795 +0000 UTC m=+173.656044742" watchObservedRunningTime="2026-03-10 14:04:29.105826178 +0000 UTC 
m=+173.669346095" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.151836 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=88.151806979 podStartE2EDuration="1m28.151806979s" podCreationTimestamp="2026-03-10 14:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.15071681 +0000 UTC m=+173.714236757" watchObservedRunningTime="2026-03-10 14:04:29.151806979 +0000 UTC m=+173.715326916" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.192552 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:29 crc kubenswrapper[4911]: E0310 14:04:29.192752 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.207859 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xxbj8" podStartSLOduration=138.207835097 podStartE2EDuration="2m18.207835097s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.206988755 +0000 UTC m=+173.770508682" watchObservedRunningTime="2026-03-10 14:04:29.207835097 +0000 UTC m=+173.771355014" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.207991 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=77.207986101 podStartE2EDuration="1m17.207986101s" podCreationTimestamp="2026-03-10 14:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.185592427 +0000 UTC m=+173.749112344" watchObservedRunningTime="2026-03-10 14:04:29.207986101 +0000 UTC m=+173.771506018" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.234279 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9b8db190-9412-494e-b9cf-0c9b919d5218-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.234343 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8db190-9412-494e-b9cf-0c9b919d5218-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.234407 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9b8db190-9412-494e-b9cf-0c9b919d5218-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.234552 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b8db190-9412-494e-b9cf-0c9b919d5218-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.234627 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b8db190-9412-494e-b9cf-0c9b919d5218-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.249066 4911 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.256677 4911 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.298064 4911 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=92.298038323 podStartE2EDuration="1m32.298038323s" podCreationTimestamp="2026-03-10 14:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.261161493 +0000 UTC m=+173.824681420" watchObservedRunningTime="2026-03-10 14:04:29.298038323 +0000 UTC m=+173.861558250" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.335204 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9b8db190-9412-494e-b9cf-0c9b919d5218-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.335265 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b8db190-9412-494e-b9cf-0c9b919d5218-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.335287 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b8db190-9412-494e-b9cf-0c9b919d5218-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.335334 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/9b8db190-9412-494e-b9cf-0c9b919d5218-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.335350 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8db190-9412-494e-b9cf-0c9b919d5218-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.335792 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9b8db190-9412-494e-b9cf-0c9b919d5218-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.336024 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9b8db190-9412-494e-b9cf-0c9b919d5218-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.336682 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b8db190-9412-494e-b9cf-0c9b919d5218-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 
14:04:29.344585 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8db190-9412-494e-b9cf-0c9b919d5218-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.364963 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b8db190-9412-494e-b9cf-0c9b919d5218-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k9nw2\" (UID: \"9b8db190-9412-494e-b9cf-0c9b919d5218\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.386560 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vfj7m" podStartSLOduration=139.386539143 podStartE2EDuration="2m19.386539143s" podCreationTimestamp="2026-03-10 14:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.386132312 +0000 UTC m=+173.949652229" watchObservedRunningTime="2026-03-10 14:04:29.386539143 +0000 UTC m=+173.950059060" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.387207 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.428908 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=54.428882657 podStartE2EDuration="54.428882657s" podCreationTimestamp="2026-03-10 14:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.428343653 +0000 UTC m=+173.991863580" watchObservedRunningTime="2026-03-10 14:04:29.428882657 +0000 UTC m=+173.992402584" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.429123 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z255c" podStartSLOduration=138.429115293 podStartE2EDuration="2m18.429115293s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.407861629 +0000 UTC m=+173.971381556" watchObservedRunningTime="2026-03-10 14:04:29.429115293 +0000 UTC m=+173.992635220" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.447561 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=63.447529282 podStartE2EDuration="1m3.447529282s" podCreationTimestamp="2026-03-10 14:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.44594554 +0000 UTC m=+174.009465467" watchObservedRunningTime="2026-03-10 14:04:29.447529282 +0000 UTC m=+174.011049199" Mar 10 14:04:29 crc kubenswrapper[4911]: I0310 14:04:29.504979 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-nsxjn" podStartSLOduration=138.504954047 podStartE2EDuration="2m18.504954047s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:29.503961871 +0000 UTC m=+174.067481788" watchObservedRunningTime="2026-03-10 14:04:29.504954047 +0000 UTC m=+174.068473964" Mar 10 14:04:30 crc kubenswrapper[4911]: I0310 14:04:30.186573 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" event={"ID":"9b8db190-9412-494e-b9cf-0c9b919d5218","Type":"ContainerStarted","Data":"38f65a5fab573120b8363a6cf992b7e17e26d12c681bbaaa72bc9bd1945a29d1"} Mar 10 14:04:30 crc kubenswrapper[4911]: I0310 14:04:30.186683 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" event={"ID":"9b8db190-9412-494e-b9cf-0c9b919d5218","Type":"ContainerStarted","Data":"5958bf2291311ef2f216543211d11c216e91aef26a711922a2336e677264fbee"} Mar 10 14:04:30 crc kubenswrapper[4911]: I0310 14:04:30.192504 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:30 crc kubenswrapper[4911]: I0310 14:04:30.192655 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:30 crc kubenswrapper[4911]: E0310 14:04:30.192905 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:30 crc kubenswrapper[4911]: E0310 14:04:30.193067 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:30 crc kubenswrapper[4911]: I0310 14:04:30.193402 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:30 crc kubenswrapper[4911]: E0310 14:04:30.193704 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:30 crc kubenswrapper[4911]: I0310 14:04:30.209768 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9nw2" podStartSLOduration=139.209703062 podStartE2EDuration="2m19.209703062s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:30.208889721 +0000 UTC m=+174.772409698" watchObservedRunningTime="2026-03-10 14:04:30.209703062 +0000 UTC m=+174.773223029" Mar 10 14:04:31 crc kubenswrapper[4911]: I0310 14:04:31.193176 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:31 crc kubenswrapper[4911]: E0310 14:04:31.193411 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:31 crc kubenswrapper[4911]: E0310 14:04:31.563382 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:04:32 crc kubenswrapper[4911]: I0310 14:04:32.192573 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:32 crc kubenswrapper[4911]: I0310 14:04:32.192629 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:32 crc kubenswrapper[4911]: I0310 14:04:32.192753 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:32 crc kubenswrapper[4911]: E0310 14:04:32.192909 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:32 crc kubenswrapper[4911]: E0310 14:04:32.193036 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:32 crc kubenswrapper[4911]: E0310 14:04:32.193134 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:33 crc kubenswrapper[4911]: I0310 14:04:33.192287 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:33 crc kubenswrapper[4911]: E0310 14:04:33.192532 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:34 crc kubenswrapper[4911]: I0310 14:04:34.192843 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:34 crc kubenswrapper[4911]: I0310 14:04:34.192910 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:34 crc kubenswrapper[4911]: I0310 14:04:34.192961 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:34 crc kubenswrapper[4911]: E0310 14:04:34.193115 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:34 crc kubenswrapper[4911]: E0310 14:04:34.193232 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:34 crc kubenswrapper[4911]: E0310 14:04:34.193439 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:35 crc kubenswrapper[4911]: I0310 14:04:35.192893 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:35 crc kubenswrapper[4911]: E0310 14:04:35.193334 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:36 crc kubenswrapper[4911]: I0310 14:04:36.192770 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:36 crc kubenswrapper[4911]: I0310 14:04:36.192931 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:36 crc kubenswrapper[4911]: I0310 14:04:36.192959 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:36 crc kubenswrapper[4911]: E0310 14:04:36.193217 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:36 crc kubenswrapper[4911]: E0310 14:04:36.193595 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:36 crc kubenswrapper[4911]: E0310 14:04:36.193715 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:36 crc kubenswrapper[4911]: E0310 14:04:36.564213 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:04:37 crc kubenswrapper[4911]: I0310 14:04:37.192906 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:37 crc kubenswrapper[4911]: E0310 14:04:37.193353 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:37 crc kubenswrapper[4911]: I0310 14:04:37.219284 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nsxjn_fc662696-d402-4969-bebd-00fa42e63075/kube-multus/1.log" Mar 10 14:04:37 crc kubenswrapper[4911]: I0310 14:04:37.220126 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nsxjn_fc662696-d402-4969-bebd-00fa42e63075/kube-multus/0.log" Mar 10 14:04:37 crc kubenswrapper[4911]: I0310 14:04:37.220214 4911 generic.go:334] "Generic (PLEG): container finished" podID="fc662696-d402-4969-bebd-00fa42e63075" containerID="434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09" exitCode=1 Mar 10 14:04:37 crc kubenswrapper[4911]: I0310 14:04:37.220466 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nsxjn" 
event={"ID":"fc662696-d402-4969-bebd-00fa42e63075","Type":"ContainerDied","Data":"434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09"} Mar 10 14:04:37 crc kubenswrapper[4911]: I0310 14:04:37.220647 4911 scope.go:117] "RemoveContainer" containerID="6e23e1a134f91ad89aaa89f4156b51646327c5dc0f87b30681f626a9aab07f54" Mar 10 14:04:37 crc kubenswrapper[4911]: I0310 14:04:37.221485 4911 scope.go:117] "RemoveContainer" containerID="434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09" Mar 10 14:04:37 crc kubenswrapper[4911]: E0310 14:04:37.221959 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-nsxjn_openshift-multus(fc662696-d402-4969-bebd-00fa42e63075)\"" pod="openshift-multus/multus-nsxjn" podUID="fc662696-d402-4969-bebd-00fa42e63075" Mar 10 14:04:38 crc kubenswrapper[4911]: I0310 14:04:38.193414 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:38 crc kubenswrapper[4911]: I0310 14:04:38.193586 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:38 crc kubenswrapper[4911]: E0310 14:04:38.193643 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:38 crc kubenswrapper[4911]: I0310 14:04:38.193490 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:38 crc kubenswrapper[4911]: E0310 14:04:38.194450 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:38 crc kubenswrapper[4911]: E0310 14:04:38.194588 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:38 crc kubenswrapper[4911]: I0310 14:04:38.225887 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nsxjn_fc662696-d402-4969-bebd-00fa42e63075/kube-multus/1.log" Mar 10 14:04:39 crc kubenswrapper[4911]: I0310 14:04:39.193408 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:39 crc kubenswrapper[4911]: E0310 14:04:39.193650 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:40 crc kubenswrapper[4911]: I0310 14:04:40.192975 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:40 crc kubenswrapper[4911]: I0310 14:04:40.193105 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:40 crc kubenswrapper[4911]: I0310 14:04:40.192974 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:40 crc kubenswrapper[4911]: E0310 14:04:40.193181 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:40 crc kubenswrapper[4911]: E0310 14:04:40.193298 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:40 crc kubenswrapper[4911]: E0310 14:04:40.193578 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:40 crc kubenswrapper[4911]: I0310 14:04:40.194420 4911 scope.go:117] "RemoveContainer" containerID="79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a" Mar 10 14:04:40 crc kubenswrapper[4911]: E0310 14:04:40.194615 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4256n_openshift-ovn-kubernetes(ed2b430b-2281-4231-9135-f0289be08cdd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" Mar 10 14:04:41 crc kubenswrapper[4911]: I0310 14:04:41.192976 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:41 crc kubenswrapper[4911]: E0310 14:04:41.193304 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:41 crc kubenswrapper[4911]: E0310 14:04:41.566057 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:04:42 crc kubenswrapper[4911]: I0310 14:04:42.193159 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:42 crc kubenswrapper[4911]: I0310 14:04:42.193258 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:42 crc kubenswrapper[4911]: E0310 14:04:42.194001 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:42 crc kubenswrapper[4911]: I0310 14:04:42.193283 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:42 crc kubenswrapper[4911]: E0310 14:04:42.194246 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:42 crc kubenswrapper[4911]: E0310 14:04:42.194438 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:43 crc kubenswrapper[4911]: I0310 14:04:43.192453 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:43 crc kubenswrapper[4911]: E0310 14:04:43.192770 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:44 crc kubenswrapper[4911]: I0310 14:04:44.192573 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:44 crc kubenswrapper[4911]: I0310 14:04:44.192704 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:44 crc kubenswrapper[4911]: E0310 14:04:44.192911 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:44 crc kubenswrapper[4911]: I0310 14:04:44.192616 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:44 crc kubenswrapper[4911]: E0310 14:04:44.193019 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:44 crc kubenswrapper[4911]: E0310 14:04:44.193300 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:45 crc kubenswrapper[4911]: I0310 14:04:45.193216 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:45 crc kubenswrapper[4911]: E0310 14:04:45.193432 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:46 crc kubenswrapper[4911]: I0310 14:04:46.192407 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:46 crc kubenswrapper[4911]: I0310 14:04:46.192539 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:46 crc kubenswrapper[4911]: I0310 14:04:46.192406 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:46 crc kubenswrapper[4911]: E0310 14:04:46.192631 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:46 crc kubenswrapper[4911]: E0310 14:04:46.192805 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:46 crc kubenswrapper[4911]: E0310 14:04:46.193049 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:46 crc kubenswrapper[4911]: E0310 14:04:46.566996 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:04:47 crc kubenswrapper[4911]: I0310 14:04:47.192597 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:47 crc kubenswrapper[4911]: E0310 14:04:47.193152 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:48 crc kubenswrapper[4911]: I0310 14:04:48.193049 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:48 crc kubenswrapper[4911]: I0310 14:04:48.193122 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:48 crc kubenswrapper[4911]: I0310 14:04:48.193289 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:48 crc kubenswrapper[4911]: E0310 14:04:48.193488 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:48 crc kubenswrapper[4911]: E0310 14:04:48.193640 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:48 crc kubenswrapper[4911]: E0310 14:04:48.193884 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:49 crc kubenswrapper[4911]: I0310 14:04:49.193160 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:49 crc kubenswrapper[4911]: E0310 14:04:49.193376 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:49 crc kubenswrapper[4911]: I0310 14:04:49.194096 4911 scope.go:117] "RemoveContainer" containerID="434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09" Mar 10 14:04:50 crc kubenswrapper[4911]: I0310 14:04:50.192792 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:50 crc kubenswrapper[4911]: I0310 14:04:50.192854 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:50 crc kubenswrapper[4911]: I0310 14:04:50.192851 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:50 crc kubenswrapper[4911]: E0310 14:04:50.193575 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:50 crc kubenswrapper[4911]: E0310 14:04:50.193884 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:50 crc kubenswrapper[4911]: E0310 14:04:50.194120 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:50 crc kubenswrapper[4911]: I0310 14:04:50.275530 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nsxjn_fc662696-d402-4969-bebd-00fa42e63075/kube-multus/1.log" Mar 10 14:04:50 crc kubenswrapper[4911]: I0310 14:04:50.275609 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nsxjn" event={"ID":"fc662696-d402-4969-bebd-00fa42e63075","Type":"ContainerStarted","Data":"98f567cea8a7526e2d641b378c91a49bdbfd2a2fe2a7dae62170af48d16b4ae1"} Mar 10 14:04:51 crc kubenswrapper[4911]: I0310 14:04:51.192550 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:51 crc kubenswrapper[4911]: E0310 14:04:51.192755 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:51 crc kubenswrapper[4911]: E0310 14:04:51.569650 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:04:52 crc kubenswrapper[4911]: I0310 14:04:52.193970 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:52 crc kubenswrapper[4911]: I0310 14:04:52.194055 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:52 crc kubenswrapper[4911]: E0310 14:04:52.194563 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:52 crc kubenswrapper[4911]: I0310 14:04:52.194068 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:52 crc kubenswrapper[4911]: E0310 14:04:52.194802 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:52 crc kubenswrapper[4911]: E0310 14:04:52.194711 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:53 crc kubenswrapper[4911]: I0310 14:04:53.193325 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:53 crc kubenswrapper[4911]: E0310 14:04:53.193599 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:54 crc kubenswrapper[4911]: I0310 14:04:54.192824 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:54 crc kubenswrapper[4911]: I0310 14:04:54.192824 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:54 crc kubenswrapper[4911]: I0310 14:04:54.192996 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:54 crc kubenswrapper[4911]: E0310 14:04:54.193227 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:54 crc kubenswrapper[4911]: E0310 14:04:54.193910 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:54 crc kubenswrapper[4911]: E0310 14:04:54.193976 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:54 crc kubenswrapper[4911]: I0310 14:04:54.194578 4911 scope.go:117] "RemoveContainer" containerID="79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a" Mar 10 14:04:55 crc kubenswrapper[4911]: I0310 14:04:55.146767 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r28f8"] Mar 10 14:04:55 crc kubenswrapper[4911]: I0310 14:04:55.147273 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:55 crc kubenswrapper[4911]: E0310 14:04:55.147426 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:55 crc kubenswrapper[4911]: I0310 14:04:55.302947 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/3.log" Mar 10 14:04:55 crc kubenswrapper[4911]: I0310 14:04:55.306671 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerStarted","Data":"d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803"} Mar 10 14:04:55 crc kubenswrapper[4911]: I0310 14:04:55.307156 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:04:55 crc kubenswrapper[4911]: I0310 14:04:55.338964 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podStartSLOduration=164.338936376 podStartE2EDuration="2m44.338936376s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:04:55.338040135 +0000 UTC m=+199.901560052" watchObservedRunningTime="2026-03-10 14:04:55.338936376 +0000 UTC m=+199.902456293" Mar 10 14:04:56 crc kubenswrapper[4911]: I0310 14:04:56.112355 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:56 crc kubenswrapper[4911]: I0310 14:04:56.112444 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112493 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112582 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:06:58.112559236 +0000 UTC m=+322.676079153 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 14:04:56 crc kubenswrapper[4911]: I0310 14:04:56.112505 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112611 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112759 4911 
secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112792 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112814 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:04:56 crc kubenswrapper[4911]: I0310 14:04:56.112755 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112876 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:06:58.112841363 +0000 UTC m=+322.676361320 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112911 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112920 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 14:06:58.112902224 +0000 UTC m=+322.676422371 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112948 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.112970 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.113046 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 14:06:58.113023787 +0000 UTC m=+322.676543744 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 14:04:56 crc kubenswrapper[4911]: I0310 14:04:56.192965 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.193157 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:56 crc kubenswrapper[4911]: I0310 14:04:56.192965 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:56 crc kubenswrapper[4911]: I0310 14:04:56.192999 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.193497 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:04:56 crc kubenswrapper[4911]: I0310 14:04:56.192977 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.193604 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.193646 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:56 crc kubenswrapper[4911]: I0310 14:04:56.314624 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.314832 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:06:58.314806566 +0000 UTC m=+322.878326483 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:04:56 crc kubenswrapper[4911]: I0310 14:04:56.314944 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.315103 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 
14:04:56.315151 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs podName:d7a44efc-20ad-4c01-9606-e6fdb5e0c721 nodeName:}" failed. No retries permitted until 2026-03-10 14:06:58.315141064 +0000 UTC m=+322.878660981 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs") pod "network-metrics-daemon-r28f8" (UID: "d7a44efc-20ad-4c01-9606-e6fdb5e0c721") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 14:04:56 crc kubenswrapper[4911]: E0310 14:04:56.570718 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 14:04:58 crc kubenswrapper[4911]: I0310 14:04:58.192665 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:04:58 crc kubenswrapper[4911]: I0310 14:04:58.192803 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:04:58 crc kubenswrapper[4911]: I0310 14:04:58.192867 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:04:58 crc kubenswrapper[4911]: I0310 14:04:58.192972 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:04:58 crc kubenswrapper[4911]: E0310 14:04:58.192960 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:04:58 crc kubenswrapper[4911]: E0310 14:04:58.193132 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:04:58 crc kubenswrapper[4911]: E0310 14:04:58.193249 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:04:58 crc kubenswrapper[4911]: E0310 14:04:58.193321 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:05:00 crc kubenswrapper[4911]: I0310 14:05:00.193038 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:05:00 crc kubenswrapper[4911]: I0310 14:05:00.193164 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:05:00 crc kubenswrapper[4911]: E0310 14:05:00.193236 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 14:05:00 crc kubenswrapper[4911]: I0310 14:05:00.193175 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:05:00 crc kubenswrapper[4911]: E0310 14:05:00.193420 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721" Mar 10 14:05:00 crc kubenswrapper[4911]: I0310 14:05:00.193560 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:05:00 crc kubenswrapper[4911]: E0310 14:05:00.193586 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 14:05:00 crc kubenswrapper[4911]: E0310 14:05:00.193791 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 14:05:02 crc kubenswrapper[4911]: I0310 14:05:02.192857 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:05:02 crc kubenswrapper[4911]: I0310 14:05:02.192950 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:05:02 crc kubenswrapper[4911]: I0310 14:05:02.192984 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:05:02 crc kubenswrapper[4911]: I0310 14:05:02.192857 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:05:02 crc kubenswrapper[4911]: I0310 14:05:02.196398 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 14:05:02 crc kubenswrapper[4911]: I0310 14:05:02.196440 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 14:05:02 crc kubenswrapper[4911]: I0310 14:05:02.197749 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 14:05:02 crc kubenswrapper[4911]: I0310 14:05:02.197862 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 14:05:02 crc kubenswrapper[4911]: I0310 14:05:02.197870 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 14:05:02 crc kubenswrapper[4911]: I0310 14:05:02.198039 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.270239 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.330126 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.330798 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.335194 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rrzzv"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.335245 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.336029 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.336139 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.336236 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.337446 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.337488 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.337941 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.355073 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.355205 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.355772 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.356237 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.357193 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.357481 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.361481 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.362413 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.363010 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.366340 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"] Mar 10 
14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.374356 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.382909 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.383171 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.383552 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.383650 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.384806 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.384864 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.385285 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.386035 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.386253 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.386880 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.387051 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.387793 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390327 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796125a2-c985-48d4-a6b4-c772b26b59ac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-82dx2\" (UID: \"796125a2-c985-48d4-a6b4-c772b26b59ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390368 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/67af9440-1c20-4115-9c48-9ad32fc36f31-etcd-ca\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390414 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/331e0c1d-7674-44da-bc5e-39358bbe9d07-serving-cert\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390447 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67af9440-1c20-4115-9c48-9ad32fc36f31-etcd-client\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390488 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4xsk\" (UniqueName: \"kubernetes.io/projected/331e0c1d-7674-44da-bc5e-39358bbe9d07-kube-api-access-l4xsk\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390510 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-config\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390532 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tr9t\" (UniqueName: \"kubernetes.io/projected/67af9440-1c20-4115-9c48-9ad32fc36f31-kube-api-access-9tr9t\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390553 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-client-ca\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390572 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796125a2-c985-48d4-a6b4-c772b26b59ac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-82dx2\" (UID: \"796125a2-c985-48d4-a6b4-c772b26b59ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390600 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/67af9440-1c20-4115-9c48-9ad32fc36f31-etcd-service-ca\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390634 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbnzk\" (UniqueName: \"kubernetes.io/projected/796125a2-c985-48d4-a6b4-c772b26b59ac-kube-api-access-mbnzk\") pod \"openshift-apiserver-operator-796bbdcf4f-82dx2\" (UID: \"796125a2-c985-48d4-a6b4-c772b26b59ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390658 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67af9440-1c20-4115-9c48-9ad32fc36f31-serving-cert\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.390677 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67af9440-1c20-4115-9c48-9ad32fc36f31-config\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.391005 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m5pk6"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.392641 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4plxk"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.392878 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.394377 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4plxk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.397110 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.397866 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.398143 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s5jkn"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.398651 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.400830 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.400905 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.402631 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.402784 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.403689 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.403985 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.404203 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.404427 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.404636 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.404960 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.405196 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.406693 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7tfvh"] Mar 10 14:05:10 crc 
kubenswrapper[4911]: I0310 14:05:10.407599 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.407620 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.408289 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.410332 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.410674 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.410796 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.410895 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.410905 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.411009 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.411079 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.411132 4911 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.411239 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.411334 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.411358 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.411430 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.411242 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.416356 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.419978 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420000 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420296 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420310 4911 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420420 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420499 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420512 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420445 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420798 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420905 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420936 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.420932 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.421030 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.421332 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.421880 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.422363 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.421890 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.422880 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.422989 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.423085 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.423186 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.423381 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.423386 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.421980 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.426715 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.427466 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.427556 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.427769 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.433451 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.451528 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.453297 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.454209 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqkdl"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.458088 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.458358 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.469315 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.469517 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.469650 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.469910 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.470104 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.470275 4911 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.470443 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.470574 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.470686 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.470772 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.470913 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.471063 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.471267 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.471331 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.471591 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.471767 4911 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"trusted-ca" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.472379 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.472566 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.474367 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.474593 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.475386 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.475483 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.475637 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.476406 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.476829 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.477052 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 14:05:10 crc 
kubenswrapper[4911]: I0310 14:05:10.477482 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.477570 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.478162 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.478930 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-297k4"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.479470 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.479874 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.480229 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.480670 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.481151 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.481900 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.482347 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.482398 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.482553 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t9tlk"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.483078 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.483225 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5c4nw"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.484049 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.484749 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.489043 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.489779 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.491083 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.491467 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nfhwf"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.491837 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nfhwf" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.492077 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.492216 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493110 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hvl4\" (UniqueName: \"kubernetes.io/projected/96814f30-58d3-4501-bb95-185465b9df21-kube-api-access-4hvl4\") pod \"migrator-59844c95c7-gwspk\" (UID: \"96814f30-58d3-4501-bb95-185465b9df21\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493138 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5sx\" (UniqueName: \"kubernetes.io/projected/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-kube-api-access-hx5sx\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493159 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493176 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-trusted-ca\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493203 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67af9440-1c20-4115-9c48-9ad32fc36f31-etcd-client\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493222 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v9d8\" (UniqueName: \"kubernetes.io/projected/20ce004d-fd3b-4770-8ba5-39f80a0b1c8a-kube-api-access-8v9d8\") pod \"dns-operator-744455d44c-t9tlk\" (UID: \"20ce004d-fd3b-4770-8ba5-39f80a0b1c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493242 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05d84958-d228-406a-9337-389f9a5f286d-service-ca-bundle\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493264 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493282 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h5hz\" (UniqueName: \"kubernetes.io/projected/39208142-b788-4b42-a0f2-421544f8833f-kube-api-access-4h5hz\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: 
\"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493306 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c79f6f9-05f1-44eb-8565-f0d888ee5163-metrics-tls\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493325 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493344 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05d84958-d228-406a-9337-389f9a5f286d-metrics-certs\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493364 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzzd\" (UID: \"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493380 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-serving-cert\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493412 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4xsk\" (UniqueName: \"kubernetes.io/projected/331e0c1d-7674-44da-bc5e-39358bbe9d07-kube-api-access-l4xsk\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493433 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a409f0e6-8131-4025-b258-842a71a125b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493454 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493472 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-config\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: 
\"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493490 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-policies\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493508 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/05d84958-d228-406a-9337-389f9a5f286d-stats-auth\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493525 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e3c1bcf-b096-4526-8b7c-d304a5afa191-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493543 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tr9t\" (UniqueName: \"kubernetes.io/projected/67af9440-1c20-4115-9c48-9ad32fc36f31-kube-api-access-9tr9t\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493561 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a409f0e6-8131-4025-b258-842a71a125b6-audit-dir\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493579 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v89q2\" (UniqueName: \"kubernetes.io/projected/c671ff01-6a3e-4626-8e8a-c78feb4b3491-kube-api-access-v89q2\") pod \"multus-admission-controller-857f4d67dd-297k4\" (UID: \"c671ff01-6a3e-4626-8e8a-c78feb4b3491\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493595 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mv88\" (UniqueName: \"kubernetes.io/projected/6e3c1bcf-b096-4526-8b7c-d304a5afa191-kube-api-access-4mv88\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493612 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-serving-cert\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493629 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-client-ca\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493647 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796125a2-c985-48d4-a6b4-c772b26b59ac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-82dx2\" (UID: \"796125a2-c985-48d4-a6b4-c772b26b59ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493663 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p5wv\" (UniqueName: \"kubernetes.io/projected/05d84958-d228-406a-9337-389f9a5f286d-kube-api-access-2p5wv\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493681 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-client-ca\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493696 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493712 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-pq766\" (UniqueName: \"kubernetes.io/projected/f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0-kube-api-access-pq766\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzzd\" (UID: \"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493746 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a409f0e6-8131-4025-b258-842a71a125b6-audit-policies\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493760 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c79f6f9-05f1-44eb-8565-f0d888ee5163-trusted-ca\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493778 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/67af9440-1c20-4115-9c48-9ad32fc36f31-etcd-service-ca\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493793 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa0f80b6-3d76-4b00-a67c-e5ace1795e58-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6j5j7\" (UID: \"aa0f80b6-3d76-4b00-a67c-e5ace1795e58\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493808 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxrsh\" (UniqueName: \"kubernetes.io/projected/2160b26f-7876-42d1-8d78-22f6f57cb08e-kube-api-access-sxrsh\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493826 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcv48\" (UniqueName: \"kubernetes.io/projected/61dbbc3f-94f4-4c65-8c0b-7181159fcae3-kube-api-access-dcv48\") pod \"control-plane-machine-set-operator-78cbb6b69f-4s5dj\" (UID: \"61dbbc3f-94f4-4c65-8c0b-7181159fcae3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493840 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frs8g\" (UniqueName: \"kubernetes.io/projected/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-kube-api-access-frs8g\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493857 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c79f6f9-05f1-44eb-8565-f0d888ee5163-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493874 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d9f774-c786-4afb-9ffd-57f2b9e0064e-config\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493892 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493909 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxh5v\" (UniqueName: \"kubernetes.io/projected/a409f0e6-8131-4025-b258-842a71a125b6-kube-api-access-hxh5v\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493927 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/92d9f774-c786-4afb-9ffd-57f2b9e0064e-machine-approver-tls\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493946 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493961 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/39208142-b788-4b42-a0f2-421544f8833f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.493987 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/61dbbc3f-94f4-4c65-8c0b-7181159fcae3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4s5dj\" (UID: \"61dbbc3f-94f4-4c65-8c0b-7181159fcae3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494015 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e9294c6c-c62f-44eb-a4c9-a09523a5965b-images\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494030 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9294c6c-c62f-44eb-a4c9-a09523a5965b-proxy-tls\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: 
\"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494045 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa0f80b6-3d76-4b00-a67c-e5ace1795e58-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6j5j7\" (UID: \"aa0f80b6-3d76-4b00-a67c-e5ace1795e58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494067 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e3c1bcf-b096-4526-8b7c-d304a5afa191-serving-cert\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494086 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9294c6c-c62f-44eb-a4c9-a09523a5965b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494105 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c671ff01-6a3e-4626-8e8a-c78feb4b3491-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-297k4\" (UID: \"c671ff01-6a3e-4626-8e8a-c78feb4b3491\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4" Mar 10 14:05:10 crc 
kubenswrapper[4911]: I0310 14:05:10.494122 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hrpl\" (UniqueName: \"kubernetes.io/projected/2c79f6f9-05f1-44eb-8565-f0d888ee5163-kube-api-access-5hrpl\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494138 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494162 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbnzk\" (UniqueName: \"kubernetes.io/projected/796125a2-c985-48d4-a6b4-c772b26b59ac-kube-api-access-mbnzk\") pod \"openshift-apiserver-operator-796bbdcf4f-82dx2\" (UID: \"796125a2-c985-48d4-a6b4-c772b26b59ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494178 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a409f0e6-8131-4025-b258-842a71a125b6-encryption-config\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494195 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-dir\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494209 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e3c1bcf-b096-4526-8b7c-d304a5afa191-config\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494224 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzzd\" (UID: \"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494241 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a409f0e6-8131-4025-b258-842a71a125b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494255 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494273 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67af9440-1c20-4115-9c48-9ad32fc36f31-serving-cert\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494287 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67af9440-1c20-4115-9c48-9ad32fc36f31-config\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494302 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a409f0e6-8131-4025-b258-842a71a125b6-serving-cert\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494318 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/05d84958-d228-406a-9337-389f9a5f286d-default-certificate\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494346 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796125a2-c985-48d4-a6b4-c772b26b59ac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-82dx2\" (UID: 
\"796125a2-c985-48d4-a6b4-c772b26b59ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494361 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a409f0e6-8131-4025-b258-842a71a125b6-etcd-client\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494376 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/67af9440-1c20-4115-9c48-9ad32fc36f31-etcd-ca\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494392 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494408 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39208142-b788-4b42-a0f2-421544f8833f-config\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494432 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6e3c1bcf-b096-4526-8b7c-d304a5afa191-service-ca-bundle\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494449 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-config\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494468 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494485 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f80b6-3d76-4b00-a67c-e5ace1795e58-config\") pod \"kube-controller-manager-operator-78b949d7b-6j5j7\" (UID: \"aa0f80b6-3d76-4b00-a67c-e5ace1795e58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494502 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-config\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " 
pod="openshift-console-operator/console-operator-58897d9998-4plxk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494518 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/331e0c1d-7674-44da-bc5e-39358bbe9d07-serving-cert\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494532 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh88d\" (UniqueName: \"kubernetes.io/projected/92d9f774-c786-4afb-9ffd-57f2b9e0064e-kube-api-access-kh88d\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494547 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs2fn\" (UniqueName: \"kubernetes.io/projected/e9294c6c-c62f-44eb-a4c9-a09523a5965b-kube-api-access-hs2fn\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494563 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/92d9f774-c786-4afb-9ffd-57f2b9e0064e-auth-proxy-config\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494578 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494593 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/39208142-b788-4b42-a0f2-421544f8833f-images\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.494609 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20ce004d-fd3b-4770-8ba5-39f80a0b1c8a-metrics-tls\") pod \"dns-operator-744455d44c-t9tlk\" (UID: \"20ce004d-fd3b-4770-8ba5-39f80a0b1c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.495690 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-client-ca\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.495804 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796125a2-c985-48d4-a6b4-c772b26b59ac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-82dx2\" (UID: \"796125a2-c985-48d4-a6b4-c772b26b59ac\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.496654 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67af9440-1c20-4115-9c48-9ad32fc36f31-config\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.496741 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.496839 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/67af9440-1c20-4115-9c48-9ad32fc36f31-etcd-ca\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.497410 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.497405 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/67af9440-1c20-4115-9c48-9ad32fc36f31-etcd-service-ca\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.500472 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-config\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.507108 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.508209 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.509066 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.509942 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.510020 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lcvpv"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.510869 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.511681 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.512069 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.513004 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/331e0c1d-7674-44da-bc5e-39358bbe9d07-serving-cert\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.513366 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.520505 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67af9440-1c20-4115-9c48-9ad32fc36f31-serving-cert\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.525169 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67af9440-1c20-4115-9c48-9ad32fc36f31-etcd-client\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.528507 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.535265 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796125a2-c985-48d4-a6b4-c772b26b59ac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-82dx2\" (UID: \"796125a2-c985-48d4-a6b4-c772b26b59ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.535694 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lpjl7"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.537978 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.539879 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.541106 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.541436 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lpjl7"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.542645 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.546118 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.551056 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-74p5x"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.551839 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zrr58"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.552042 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.552391 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zrr58"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.555043 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rrzzv"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.556283 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rbltz"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.556707 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.557205 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.563546 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.564861 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.565836 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.568812 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552524-98vcv"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.569538 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552524-98vcv"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.571036 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.572037 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.573184 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.573304 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.574408 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.577561 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.580741 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.585000 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s5jkn"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.586248 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7tfvh"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.587349 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rmmf6"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.589880 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4plxk"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.590011 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rmmf6"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.590208 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.591560 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.592526 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595106 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c671ff01-6a3e-4626-8e8a-c78feb4b3491-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-297k4\" (UID: \"c671ff01-6a3e-4626-8e8a-c78feb4b3491\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595145 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hrpl\" (UniqueName: \"kubernetes.io/projected/2c79f6f9-05f1-44eb-8565-f0d888ee5163-kube-api-access-5hrpl\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595177 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e3c1bcf-b096-4526-8b7c-d304a5afa191-serving-cert\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595206 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t9tlk"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595210 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9294c6c-c62f-44eb-a4c9-a09523a5965b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595328 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595375 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a409f0e6-8131-4025-b258-842a71a125b6-encryption-config\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595403 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-dir\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595452 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e3c1bcf-b096-4526-8b7c-d304a5afa191-config\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595477 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzzd\" (UID: \"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595501 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595532 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a409f0e6-8131-4025-b258-842a71a125b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595558 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/05d84958-d228-406a-9337-389f9a5f286d-default-certificate\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595585 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a409f0e6-8131-4025-b258-842a71a125b6-serving-cert\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595636 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a409f0e6-8131-4025-b258-842a71a125b6-etcd-client\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595656 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595680 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39208142-b788-4b42-a0f2-421544f8833f-config\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595745 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595779 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e3c1bcf-b096-4526-8b7c-d304a5afa191-service-ca-bundle\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595807 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-config\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595835 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f80b6-3d76-4b00-a67c-e5ace1795e58-config\") pod \"kube-controller-manager-operator-78b949d7b-6j5j7\" (UID: \"aa0f80b6-3d76-4b00-a67c-e5ace1795e58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595870 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-config\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595911 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh88d\" (UniqueName: \"kubernetes.io/projected/92d9f774-c786-4afb-9ffd-57f2b9e0064e-kube-api-access-kh88d\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595951 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs2fn\" (UniqueName: \"kubernetes.io/projected/e9294c6c-c62f-44eb-a4c9-a09523a5965b-kube-api-access-hs2fn\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595985 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.596015 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/39208142-b788-4b42-a0f2-421544f8833f-images\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.596048 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20ce004d-fd3b-4770-8ba5-39f80a0b1c8a-metrics-tls\") pod \"dns-operator-744455d44c-t9tlk\" (UID: \"20ce004d-fd3b-4770-8ba5-39f80a0b1c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.596092 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/92d9f774-c786-4afb-9ffd-57f2b9e0064e-auth-proxy-config\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.596126 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hvl4\" (UniqueName: \"kubernetes.io/projected/96814f30-58d3-4501-bb95-185465b9df21-kube-api-access-4hvl4\") pod \"migrator-59844c95c7-gwspk\" (UID: \"96814f30-58d3-4501-bb95-185465b9df21\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.597344 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a409f0e6-8131-4025-b258-842a71a125b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.597749 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.597834 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.598548 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/39208142-b788-4b42-a0f2-421544f8833f-images\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.598927 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a409f0e6-8131-4025-b258-842a71a125b6-etcd-client\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.600055 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lpjl7"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.600652 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.601602 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39208142-b788-4b42-a0f2-421544f8833f-config\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.601644 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.601793 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.602222 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f80b6-3d76-4b00-a67c-e5ace1795e58-config\") pod \"kube-controller-manager-operator-78b949d7b-6j5j7\" (UID: \"aa0f80b6-3d76-4b00-a67c-e5ace1795e58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.602452 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.602962 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-config\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.603021 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9294c6c-c62f-44eb-a4c9-a09523a5965b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.603263 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a409f0e6-8131-4025-b258-842a71a125b6-encryption-config\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.603747 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e3c1bcf-b096-4526-8b7c-d304a5afa191-config\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.604114 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c671ff01-6a3e-4626-8e8a-c78feb4b3491-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-297k4\" (UID: \"c671ff01-6a3e-4626-8e8a-c78feb4b3491\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.604470 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e3c1bcf-b096-4526-8b7c-d304a5afa191-service-ca-bundle\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.604539 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/92d9f774-c786-4afb-9ffd-57f2b9e0064e-auth-proxy-config\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.604522 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.604572 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-297k4"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.605536 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e3c1bcf-b096-4526-8b7c-d304a5afa191-serving-cert\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.595862 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-dir\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.605661 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8"]
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.605757 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a409f0e6-8131-4025-b258-842a71a125b6-serving-cert\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.605870 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-config\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.606843 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzzd\" (UID: \"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.601713 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx5sx\" (UniqueName: \"kubernetes.io/projected/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-kube-api-access-hx5sx\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.608857 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.608905 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-trusted-ca\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.608933 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9d8\" (UniqueName: \"kubernetes.io/projected/20ce004d-fd3b-4770-8ba5-39f80a0b1c8a-kube-api-access-8v9d8\") pod \"dns-operator-744455d44c-t9tlk\" (UID: \"20ce004d-fd3b-4770-8ba5-39f80a0b1c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.608964 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05d84958-d228-406a-9337-389f9a5f286d-service-ca-bundle\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.608982 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609002 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h5hz\" (UniqueName: \"kubernetes.io/projected/39208142-b788-4b42-a0f2-421544f8833f-kube-api-access-4h5hz\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609023 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609044 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c79f6f9-05f1-44eb-8565-f0d888ee5163-metrics-tls\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609062 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05d84958-d228-406a-9337-389f9a5f286d-metrics-certs\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609105 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzzd\" (UID: \"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609125 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-serving-cert\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609185 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a409f0e6-8131-4025-b258-842a71a125b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609204 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609228 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/05d84958-d228-406a-9337-389f9a5f286d-stats-auth\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609248 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e3c1bcf-b096-4526-8b7c-d304a5afa191-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"
Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609268 4911
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-policies\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609813 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v89q2\" (UniqueName: \"kubernetes.io/projected/c671ff01-6a3e-4626-8e8a-c78feb4b3491-kube-api-access-v89q2\") pod \"multus-admission-controller-857f4d67dd-297k4\" (UID: \"c671ff01-6a3e-4626-8e8a-c78feb4b3491\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609843 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a409f0e6-8131-4025-b258-842a71a125b6-audit-dir\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609922 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mv88\" (UniqueName: \"kubernetes.io/projected/6e3c1bcf-b096-4526-8b7c-d304a5afa191-kube-api-access-4mv88\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609941 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-serving-cert\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.609991 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p5wv\" (UniqueName: \"kubernetes.io/projected/05d84958-d228-406a-9337-389f9a5f286d-kube-api-access-2p5wv\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610010 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-client-ca\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610034 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610052 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq766\" (UniqueName: \"kubernetes.io/projected/f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0-kube-api-access-pq766\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzzd\" (UID: \"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610076 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/a409f0e6-8131-4025-b258-842a71a125b6-audit-policies\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610099 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c79f6f9-05f1-44eb-8565-f0d888ee5163-trusted-ca\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610119 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxrsh\" (UniqueName: \"kubernetes.io/projected/2160b26f-7876-42d1-8d78-22f6f57cb08e-kube-api-access-sxrsh\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610143 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa0f80b6-3d76-4b00-a67c-e5ace1795e58-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6j5j7\" (UID: \"aa0f80b6-3d76-4b00-a67c-e5ace1795e58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610401 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcv48\" (UniqueName: \"kubernetes.io/projected/61dbbc3f-94f4-4c65-8c0b-7181159fcae3-kube-api-access-dcv48\") pod \"control-plane-machine-set-operator-78cbb6b69f-4s5dj\" (UID: \"61dbbc3f-94f4-4c65-8c0b-7181159fcae3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj" Mar 10 14:05:10 crc 
kubenswrapper[4911]: I0310 14:05:10.610426 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frs8g\" (UniqueName: \"kubernetes.io/projected/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-kube-api-access-frs8g\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610467 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c79f6f9-05f1-44eb-8565-f0d888ee5163-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610495 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxh5v\" (UniqueName: \"kubernetes.io/projected/a409f0e6-8131-4025-b258-842a71a125b6-kube-api-access-hxh5v\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610514 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d9f774-c786-4afb-9ffd-57f2b9e0064e-config\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610531 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: 
\"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610548 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610565 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/39208142-b788-4b42-a0f2-421544f8833f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610582 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/61dbbc3f-94f4-4c65-8c0b-7181159fcae3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4s5dj\" (UID: \"61dbbc3f-94f4-4c65-8c0b-7181159fcae3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610610 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/92d9f774-c786-4afb-9ffd-57f2b9e0064e-machine-approver-tls\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610630 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e9294c6c-c62f-44eb-a4c9-a09523a5965b-images\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610646 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9294c6c-c62f-44eb-a4c9-a09523a5965b-proxy-tls\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.610661 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa0f80b6-3d76-4b00-a67c-e5ace1795e58-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6j5j7\" (UID: \"aa0f80b6-3d76-4b00-a67c-e5ace1795e58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.611536 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzzd\" (UID: \"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.614305 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-serving-cert\") pod \"console-operator-58897d9998-4plxk\" (UID: 
\"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.614774 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a409f0e6-8131-4025-b258-842a71a125b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.615315 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.616787 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-trusted-ca\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.617909 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.618097 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.619160 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c79f6f9-05f1-44eb-8565-f0d888ee5163-metrics-tls\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.619302 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e3c1bcf-b096-4526-8b7c-d304a5afa191-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.619811 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-policies\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.619855 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a409f0e6-8131-4025-b258-842a71a125b6-audit-dir\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.620037 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.620449 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-client-ca\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.620495 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d9f774-c786-4afb-9ffd-57f2b9e0064e-config\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.620665 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.622704 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a409f0e6-8131-4025-b258-842a71a125b6-audit-policies\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.623003 
4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/92d9f774-c786-4afb-9ffd-57f2b9e0064e-machine-approver-tls\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.623350 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c79f6f9-05f1-44eb-8565-f0d888ee5163-trusted-ca\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.623411 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa0f80b6-3d76-4b00-a67c-e5ace1795e58-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6j5j7\" (UID: \"aa0f80b6-3d76-4b00-a67c-e5ace1795e58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.623495 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.623536 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e9294c6c-c62f-44eb-a4c9-a09523a5965b-images\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.624020 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/61dbbc3f-94f4-4c65-8c0b-7181159fcae3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4s5dj\" (UID: \"61dbbc3f-94f4-4c65-8c0b-7181159fcae3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.624528 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/39208142-b788-4b42-a0f2-421544f8833f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.624600 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9294c6c-c62f-44eb-a4c9-a09523a5965b-proxy-tls\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.625811 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-serving-cert\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.627034 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 
14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.628520 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.629263 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.631377 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m5pk6"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.632705 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.632994 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.634185 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.635437 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.636789 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqkdl"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.638205 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.640451 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.641856 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.642870 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-74p5x"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.646133 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.654314 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.655466 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.656919 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.657977 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552524-98vcv"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.659206 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8z6vx"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.660117 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.661375 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5hb5z"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.662285 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zrr58"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.662403 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5hb5z" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.663026 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rbltz"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.664241 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.665532 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.667034 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nfhwf"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.668189 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5hb5z"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.669298 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8z6vx"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.670380 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lcvpv"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.671418 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-rmmf6"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.672574 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-jjrkk"] Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.673290 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.673309 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jjrkk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.713279 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.733224 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.744359 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20ce004d-fd3b-4770-8ba5-39f80a0b1c8a-metrics-tls\") pod \"dns-operator-744455d44c-t9tlk\" (UID: \"20ce004d-fd3b-4770-8ba5-39f80a0b1c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.755515 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.772784 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.793783 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.813429 4911 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.833499 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.839384 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/05d84958-d228-406a-9337-389f9a5f286d-default-certificate\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.853875 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.860370 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/05d84958-d228-406a-9337-389f9a5f286d-stats-auth\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.873119 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.884222 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05d84958-d228-406a-9337-389f9a5f286d-metrics-certs\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.893975 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 14:05:10 
crc kubenswrapper[4911]: I0310 14:05:10.898508 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05d84958-d228-406a-9337-389f9a5f286d-service-ca-bundle\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.913766 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.952515 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.973118 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 14:05:10 crc kubenswrapper[4911]: I0310 14:05:10.994248 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.013621 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.033230 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.053952 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.073745 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.094030 4911 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.114452 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.134127 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.153094 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.192405 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbnzk\" (UniqueName: \"kubernetes.io/projected/796125a2-c985-48d4-a6b4-c772b26b59ac-kube-api-access-mbnzk\") pod \"openshift-apiserver-operator-796bbdcf4f-82dx2\" (UID: \"796125a2-c985-48d4-a6b4-c772b26b59ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.213148 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tr9t\" (UniqueName: \"kubernetes.io/projected/67af9440-1c20-4115-9c48-9ad32fc36f31-kube-api-access-9tr9t\") pod \"etcd-operator-b45778765-rrzzv\" (UID: \"67af9440-1c20-4115-9c48-9ad32fc36f31\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.233943 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.241589 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4xsk\" (UniqueName: 
\"kubernetes.io/projected/331e0c1d-7674-44da-bc5e-39358bbe9d07-kube-api-access-l4xsk\") pod \"route-controller-manager-6576b87f9c-ljh7q\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.253872 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.274433 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.283765 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.293654 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.294665 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.314631 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.328718 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.334234 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.353581 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.373749 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.394801 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.413684 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.433349 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.456769 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.473974 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.494906 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.512092 4911 request.go:700] Waited for 1.000672268s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/configmaps?fieldSelector=metadata.name%3Detcd-serving-ca&limit=500&resourceVersion=0 Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.515249 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.533869 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.548876 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2"] Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.564525 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.573346 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.594359 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.613551 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.633036 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.653961 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.675221 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 14:05:11 crc 
kubenswrapper[4911]: I0310 14:05:11.693299 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.705567 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rrzzv"] Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.712363 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q"] Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.719553 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 14:05:11 crc kubenswrapper[4911]: W0310 14:05:11.722226 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod331e0c1d_7674_44da_bc5e_39358bbe9d07.slice/crio-e6edfd041c49d60c9f106587281aacd3b06962808377846be3d0a00956e78ff6 WatchSource:0}: Error finding container e6edfd041c49d60c9f106587281aacd3b06962808377846be3d0a00956e78ff6: Status 404 returned error can't find the container with id e6edfd041c49d60c9f106587281aacd3b06962808377846be3d0a00956e78ff6 Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.734470 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.753382 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.774266 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.795588 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 14:05:11 crc 
kubenswrapper[4911]: I0310 14:05:11.814155 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.833104 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.853666 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.873564 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.892895 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.914649 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.932936 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.952812 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.973379 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 14:05:11 crc kubenswrapper[4911]: I0310 14:05:11.994662 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.019448 4911 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.032850 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.053663 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.074012 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.093910 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.113369 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.133867 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.153806 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.173850 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.194820 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.213848 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.234352 4911 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.253585 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.274834 4911 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.293347 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.333653 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hrpl\" (UniqueName: \"kubernetes.io/projected/2c79f6f9-05f1-44eb-8565-f0d888ee5163-kube-api-access-5hrpl\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.354964 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs2fn\" (UniqueName: \"kubernetes.io/projected/e9294c6c-c62f-44eb-a4c9-a09523a5965b-kube-api-access-hs2fn\") pod \"machine-config-operator-74547568cd-bqrb8\" (UID: \"e9294c6c-c62f-44eb-a4c9-a09523a5965b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.372191 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hvl4\" (UniqueName: \"kubernetes.io/projected/96814f30-58d3-4501-bb95-185465b9df21-kube-api-access-4hvl4\") pod \"migrator-59844c95c7-gwspk\" (UID: \"96814f30-58d3-4501-bb95-185465b9df21\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 
14:05:12.390285 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh88d\" (UniqueName: \"kubernetes.io/projected/92d9f774-c786-4afb-9ffd-57f2b9e0064e-kube-api-access-kh88d\") pod \"machine-approver-56656f9798-sgkhd\" (UID: \"92d9f774-c786-4afb-9ffd-57f2b9e0064e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.399792 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" event={"ID":"796125a2-c985-48d4-a6b4-c772b26b59ac","Type":"ContainerStarted","Data":"7053f635d5f943e7da4a2102c5a4475e74d415ed33a3419bec22a7c3a0730049"} Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.399847 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" event={"ID":"796125a2-c985-48d4-a6b4-c772b26b59ac","Type":"ContainerStarted","Data":"05c88bda4db8b6546f4c3cc13d20e1fc2636db8386030e6737353a1f43ab7cd5"} Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.401188 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" event={"ID":"67af9440-1c20-4115-9c48-9ad32fc36f31","Type":"ContainerStarted","Data":"4b6ac404a9f77cf28072b12c8cd687ddaa6453dccdf937a2d49230bfc6e1695a"} Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.401230 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" event={"ID":"67af9440-1c20-4115-9c48-9ad32fc36f31","Type":"ContainerStarted","Data":"5cc1c4769a7f5723fd68ce0ec494c2581203a29eaf152794bea069d97187cecb"} Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.402327 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" 
event={"ID":"331e0c1d-7674-44da-bc5e-39358bbe9d07","Type":"ContainerStarted","Data":"34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d"} Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.402355 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" event={"ID":"331e0c1d-7674-44da-bc5e-39358bbe9d07","Type":"ContainerStarted","Data":"e6edfd041c49d60c9f106587281aacd3b06962808377846be3d0a00956e78ff6"} Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.402658 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.415500 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx5sx\" (UniqueName: \"kubernetes.io/projected/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-kube-api-access-hx5sx\") pod \"controller-manager-879f6c89f-pqkdl\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.446228 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa0f80b6-3d76-4b00-a67c-e5ace1795e58-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6j5j7\" (UID: \"aa0f80b6-3d76-4b00-a67c-e5ace1795e58\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.449459 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9d8\" (UniqueName: \"kubernetes.io/projected/20ce004d-fd3b-4770-8ba5-39f80a0b1c8a-kube-api-access-8v9d8\") pod \"dns-operator-744455d44c-t9tlk\" (UID: \"20ce004d-fd3b-4770-8ba5-39f80a0b1c8a\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.458578 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.471608 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.473529 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frs8g\" (UniqueName: \"kubernetes.io/projected/4dd0502d-ca6f-45ff-8a4c-bfff96edaf04-kube-api-access-frs8g\") pod \"console-operator-58897d9998-4plxk\" (UID: \"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04\") " pod="openshift-console-operator/console-operator-58897d9998-4plxk" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.487957 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxrsh\" (UniqueName: \"kubernetes.io/projected/2160b26f-7876-42d1-8d78-22f6f57cb08e-kube-api-access-sxrsh\") pod \"oauth-openshift-558db77b4-7tfvh\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") " pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.488213 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.512232 4911 request.go:700] Waited for 1.893738995s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/control-plane-machine-set-operator/token Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.530464 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h5hz\" (UniqueName: \"kubernetes.io/projected/39208142-b788-4b42-a0f2-421544f8833f-kube-api-access-4h5hz\") pod \"machine-api-operator-5694c8668f-s5jkn\" (UID: \"39208142-b788-4b42-a0f2-421544f8833f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.530605 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcv48\" (UniqueName: \"kubernetes.io/projected/61dbbc3f-94f4-4c65-8c0b-7181159fcae3-kube-api-access-dcv48\") pod \"control-plane-machine-set-operator-78cbb6b69f-4s5dj\" (UID: \"61dbbc3f-94f4-4c65-8c0b-7181159fcae3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.579514 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.584924 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.587458 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxh5v\" (UniqueName: \"kubernetes.io/projected/a409f0e6-8131-4025-b258-842a71a125b6-kube-api-access-hxh5v\") pod \"apiserver-7bbb656c7d-cqz22\" (UID: \"a409f0e6-8131-4025-b258-842a71a125b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.610505 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p5wv\" (UniqueName: \"kubernetes.io/projected/05d84958-d228-406a-9337-389f9a5f286d-kube-api-access-2p5wv\") pod \"router-default-5444994796-5c4nw\" (UID: \"05d84958-d228-406a-9337-389f9a5f286d\") " pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.620624 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mv88\" (UniqueName: \"kubernetes.io/projected/6e3c1bcf-b096-4526-8b7c-d304a5afa191-kube-api-access-4mv88\") pod \"authentication-operator-69f744f599-f4tz6\" (UID: \"6e3c1bcf-b096-4526-8b7c-d304a5afa191\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.621798 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c79f6f9-05f1-44eb-8565-f0d888ee5163-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wlwqw\" (UID: \"2c79f6f9-05f1-44eb-8565-f0d888ee5163\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:12 crc kubenswrapper[4911]: W0310 14:05:12.622639 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d9f774_c786_4afb_9ffd_57f2b9e0064e.slice/crio-9cac21863a89c1694da0c19bad2752e0337837dff410e89a56bc37dbdf546287 WatchSource:0}: Error finding container 9cac21863a89c1694da0c19bad2752e0337837dff410e89a56bc37dbdf546287: Status 404 returned error can't find the container with id 9cac21863a89c1694da0c19bad2752e0337837dff410e89a56bc37dbdf546287 Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.629536 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.630275 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v89q2\" (UniqueName: \"kubernetes.io/projected/c671ff01-6a3e-4626-8e8a-c78feb4b3491-kube-api-access-v89q2\") pod \"multus-admission-controller-857f4d67dd-297k4\" (UID: \"c671ff01-6a3e-4626-8e8a-c78feb4b3491\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4" Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.647947 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4plxk"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.651934 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq766\" (UniqueName: \"kubernetes.io/projected/f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0-kube-api-access-pq766\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzzd\" (UID: \"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.653845 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.659194 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.679660 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.680462 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.693062 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.698052 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.714513 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.733416 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.735270 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.777847 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8"]
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.778427 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.779660 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.780064 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.781014 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.782704 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.795122 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5c4nw"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.804263 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.813955 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.823037 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.837384 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.865858 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqkdl"]
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.890867 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.911961 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t9tlk"]
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982583 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzt2l\" (UniqueName: \"kubernetes.io/projected/91baa02e-ad6d-425e-9be1-03972ce6d4ec-kube-api-access-gzt2l\") pod \"kube-storage-version-migrator-operator-b67b599dd-v6zbd\" (UID: \"91baa02e-ad6d-425e-9be1-03972ce6d4ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982626 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91baa02e-ad6d-425e-9be1-03972ce6d4ec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v6zbd\" (UID: \"91baa02e-ad6d-425e-9be1-03972ce6d4ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982672 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-tls\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982695 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f89e6f0d-a78a-4543-9b03-ad1245748d9a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982766 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-bound-sa-token\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982823 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982878 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f89e6f0d-a78a-4543-9b03-ad1245748d9a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982900 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-certificates\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982933 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91baa02e-ad6d-425e-9be1-03972ce6d4ec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v6zbd\" (UID: \"91baa02e-ad6d-425e-9be1-03972ce6d4ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982957 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7ecb36c-d108-4771-bdc6-0d3489756695-proxy-tls\") pod \"machine-config-controller-84d6567774-gh5gn\" (UID: \"a7ecb36c-d108-4771-bdc6-0d3489756695\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.982984 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/596d1d14-4266-4409-87fb-0a155d8e69a4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bsj4b\" (UID: \"596d1d14-4266-4409-87fb-0a155d8e69a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.983004 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtlnj\" (UniqueName: \"kubernetes.io/projected/596d1d14-4266-4409-87fb-0a155d8e69a4-kube-api-access-mtlnj\") pod \"cluster-samples-operator-665b6dd947-bsj4b\" (UID: \"596d1d14-4266-4409-87fb-0a155d8e69a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.983027 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-trusted-ca\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.983054 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pj9t\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-kube-api-access-7pj9t\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.983079 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5lzt\" (UniqueName: \"kubernetes.io/projected/a7ecb36c-d108-4771-bdc6-0d3489756695-kube-api-access-p5lzt\") pod \"machine-config-controller-84d6567774-gh5gn\" (UID: \"a7ecb36c-d108-4771-bdc6-0d3489756695\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn"
Mar 10 14:05:12 crc kubenswrapper[4911]: I0310 14:05:12.983140 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a7ecb36c-d108-4771-bdc6-0d3489756695-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gh5gn\" (UID: \"a7ecb36c-d108-4771-bdc6-0d3489756695\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn"
Mar 10 14:05:12 crc kubenswrapper[4911]: E0310 14:05:12.985694 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:13.4856694 +0000 UTC m=+218.049189537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:13 crc kubenswrapper[4911]: W0310 14:05:13.022917 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6c1b4d2_0c1e_4cc5_8f1f_a3d724e5d62d.slice/crio-dab234ec58546b342e8a901a5976470077920993c44ace14f24f5eb6ffd2f771 WatchSource:0}: Error finding container dab234ec58546b342e8a901a5976470077920993c44ace14f24f5eb6ffd2f771: Status 404 returned error can't find the container with id dab234ec58546b342e8a901a5976470077920993c44ace14f24f5eb6ffd2f771
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.086531 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087114 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-socket-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087166 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbxtb\" (UniqueName: \"kubernetes.io/projected/c2c2a7e9-e34c-4dbf-81fa-945a91578cd7-kube-api-access-fbxtb\") pod \"package-server-manager-789f6589d5-pmkwn\" (UID: \"c2c2a7e9-e34c-4dbf-81fa-945a91578cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087185 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-serving-cert\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087225 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4371fc-62a4-4ef6-ab10-4d742f56b8de-config\") pod \"service-ca-operator-777779d784-74p5x\" (UID: \"de4371fc-62a4-4ef6-ab10-4d742f56b8de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087242 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c5745a8f-aa4e-41e6-8693-a253ffff59a8-node-bootstrap-token\") pod \"machine-config-server-jjrkk\" (UID: \"c5745a8f-aa4e-41e6-8693-a253ffff59a8\") " pod="openshift-machine-config-operator/machine-config-server-jjrkk"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087260 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2f1294ca-e6aa-4294-826d-3d37b345aea2-profile-collector-cert\") pod \"catalog-operator-68c6474976-kmlz4\" (UID: \"2f1294ca-e6aa-4294-826d-3d37b345aea2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087277 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-oauth-config\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087293 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-audit-dir\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087329 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f89e6f0d-a78a-4543-9b03-ad1245748d9a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087355 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-certificates\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087371 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-etcd-client\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087387 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-service-ca\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087405 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/44e30f3b-b91b-42c7-9a50-35b8b08f2e25-srv-cert\") pod \"olm-operator-6b444d44fb-47624\" (UID: \"44e30f3b-b91b-42c7-9a50-35b8b08f2e25\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087441 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91baa02e-ad6d-425e-9be1-03972ce6d4ec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v6zbd\" (UID: \"91baa02e-ad6d-425e-9be1-03972ce6d4ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087467 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/596d1d14-4266-4409-87fb-0a155d8e69a4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bsj4b\" (UID: \"596d1d14-4266-4409-87fb-0a155d8e69a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087484 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlnj\" (UniqueName: \"kubernetes.io/projected/596d1d14-4266-4409-87fb-0a155d8e69a4-kube-api-access-mtlnj\") pod \"cluster-samples-operator-665b6dd947-bsj4b\" (UID: \"596d1d14-4266-4409-87fb-0a155d8e69a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087500 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9fl4\" (UniqueName: \"kubernetes.io/projected/1ad00bf3-d146-4d5d-806e-fb340e3762bf-kube-api-access-d9fl4\") pod \"downloads-7954f5f757-nfhwf\" (UID: \"1ad00bf3-d146-4d5d-806e-fb340e3762bf\") " pod="openshift-console/downloads-7954f5f757-nfhwf"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087566 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7-config-volume\") pod \"dns-default-8z6vx\" (UID: \"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7\") " pod="openshift-dns/dns-default-8z6vx"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087581 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c5745a8f-aa4e-41e6-8693-a253ffff59a8-certs\") pod \"machine-config-server-jjrkk\" (UID: \"c5745a8f-aa4e-41e6-8693-a253ffff59a8\") " pod="openshift-machine-config-operator/machine-config-server-jjrkk"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087596 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-trusted-ca\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087614 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pj9t\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-kube-api-access-7pj9t\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087632 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rv5j\" (UniqueName: \"kubernetes.io/projected/c5745a8f-aa4e-41e6-8693-a253ffff59a8-kube-api-access-8rv5j\") pod \"machine-config-server-jjrkk\" (UID: \"c5745a8f-aa4e-41e6-8693-a253ffff59a8\") " pod="openshift-machine-config-operator/machine-config-server-jjrkk"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087667 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5lzt\" (UniqueName: \"kubernetes.io/projected/a7ecb36c-d108-4771-bdc6-0d3489756695-kube-api-access-p5lzt\") pod \"machine-config-controller-84d6567774-gh5gn\" (UID: \"a7ecb36c-d108-4771-bdc6-0d3489756695\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.087712 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a7ecb36c-d108-4771-bdc6-0d3489756695-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gh5gn\" (UID: \"a7ecb36c-d108-4771-bdc6-0d3489756695\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn"
Mar 10 14:05:13 crc kubenswrapper[4911]: E0310 14:05:13.093986 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:13.593950938 +0000 UTC m=+218.157470855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096090 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4nr5\" (UniqueName: \"kubernetes.io/projected/44e30f3b-b91b-42c7-9a50-35b8b08f2e25-kube-api-access-x4nr5\") pod \"olm-operator-6b444d44fb-47624\" (UID: \"44e30f3b-b91b-42c7-9a50-35b8b08f2e25\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096190 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-encryption-config\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096229 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c33024d8-cc44-404d-91c1-f10018a0ffe4-tmpfs\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096271 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnx7l\" (UniqueName: \"kubernetes.io/projected/c33024d8-cc44-404d-91c1-f10018a0ffe4-kube-api-access-wnx7l\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096323 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svhjh\" (UniqueName: \"kubernetes.io/projected/ff12c614-1348-42b6-ab73-75e3f89c20aa-kube-api-access-svhjh\") pod \"service-ca-9c57cc56f-zrr58\" (UID: \"ff12c614-1348-42b6-ab73-75e3f89c20aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-zrr58"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096354 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnqgx\" (UniqueName: \"kubernetes.io/projected/28f8f1c7-122d-47a4-8de8-90db75c3365b-kube-api-access-fnqgx\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096381 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de4371fc-62a4-4ef6-ab10-4d742f56b8de-serving-cert\") pod \"service-ca-operator-777779d784-74p5x\" (UID: \"de4371fc-62a4-4ef6-ab10-4d742f56b8de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096410 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d20f65c-3043-4310-83f1-300d0283f9b4-config-volume\") pod \"collect-profiles-29552520-b2vs2\" (UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096420 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-certificates\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096452 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096836 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-config\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096897 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c33024d8-cc44-404d-91c1-f10018a0ffe4-apiservice-cert\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096917 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-oauth-serving-cert\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096948 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d20f65c-3043-4310-83f1-300d0283f9b4-secret-volume\") pod \"collect-profiles-29552520-b2vs2\" (UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.096982 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcvlp\" (UniqueName: \"kubernetes.io/projected/de4371fc-62a4-4ef6-ab10-4d742f56b8de-kube-api-access-pcvlp\") pod \"service-ca-operator-777779d784-74p5x\" (UID: \"de4371fc-62a4-4ef6-ab10-4d742f56b8de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.097005 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbjtp\" (UniqueName: \"kubernetes.io/projected/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-kube-api-access-dbjtp\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.094407 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f89e6f0d-a78a-4543-9b03-ad1245748d9a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.097522 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-trusted-ca\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.097720 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.097942 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-image-import-ca\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.097983 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c2a7e9-e34c-4dbf-81fa-945a91578cd7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-pmkwn\" (UID: \"c2c2a7e9-e34c-4dbf-81fa-945a91578cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098009 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9vwh\" (UniqueName: \"kubernetes.io/projected/4ca54095-1004-4c8c-a8bd-89a2c28fae76-kube-api-access-k9vwh\") pod \"openshift-config-operator-7777fb866f-n2q7w\" (UID: \"4ca54095-1004-4c8c-a8bd-89a2c28fae76\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098034 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7j7x\" (UniqueName: \"kubernetes.io/projected/0d20f65c-3043-4310-83f1-300d0283f9b4-kube-api-access-s7j7x\") pod \"collect-profiles-29552520-b2vs2\" (UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098076 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-bound-sa-token\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098100 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-audit\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098136 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-etcd-serving-ca\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098240 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2f1294ca-e6aa-4294-826d-3d37b345aea2-srv-cert\") pod \"catalog-operator-68c6474976-kmlz4\" (UID: \"2f1294ca-e6aa-4294-826d-3d37b345aea2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098276 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-registration-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098343 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7ecb36c-d108-4771-bdc6-0d3489756695-proxy-tls\") pod \"machine-config-controller-84d6567774-gh5gn\" (UID: \"a7ecb36c-d108-4771-bdc6-0d3489756695\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098365 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a081d982-3537-4193-9dcc-fa423bdd0fbc-cert\") pod \"ingress-canary-5hb5z\" (UID: \"a081d982-3537-4193-9dcc-fa423bdd0fbc\") " pod="openshift-ingress-canary/ingress-canary-5hb5z"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098420 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/826d5f3c-f7f7-404e-b5c5-6a33cde74892-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8vtxn\" (UID: \"826d5f3c-f7f7-404e-b5c5-6a33cde74892\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098462 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-csi-data-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098500 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4ca54095-1004-4c8c-a8bd-89a2c28fae76-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n2q7w\" (UID: \"4ca54095-1004-4c8c-a8bd-89a2c28fae76\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098521 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName:
\"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-mountpoint-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098541 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-config\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098563 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8856x\" (UniqueName: \"kubernetes.io/projected/a081d982-3537-4193-9dcc-fa423bdd0fbc-kube-api-access-8856x\") pod \"ingress-canary-5hb5z\" (UID: \"a081d982-3537-4193-9dcc-fa423bdd0fbc\") " pod="openshift-ingress-canary/ingress-canary-5hb5z" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098585 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ff12c614-1348-42b6-ab73-75e3f89c20aa-signing-cabundle\") pod \"service-ca-9c57cc56f-zrr58\" (UID: \"ff12c614-1348-42b6-ab73-75e3f89c20aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098607 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fff7433-4b4b-457c-a985-3cd636960250-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jfn5v\" (UID: \"0fff7433-4b4b-457c-a985-3cd636960250\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098644 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098685 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/44e30f3b-b91b-42c7-9a50-35b8b08f2e25-profile-collector-cert\") pod \"olm-operator-6b444d44fb-47624\" (UID: \"44e30f3b-b91b-42c7-9a50-35b8b08f2e25\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098753 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ff12c614-1348-42b6-ab73-75e3f89c20aa-signing-key\") pod \"service-ca-9c57cc56f-zrr58\" (UID: \"ff12c614-1348-42b6-ab73-75e3f89c20aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098776 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-plugins-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098811 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-node-pullsecrets\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " 
pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098847 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c33024d8-cc44-404d-91c1-f10018a0ffe4-webhook-cert\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098868 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fff7433-4b4b-457c-a985-3cd636960250-config\") pod \"kube-apiserver-operator-766d6c64bb-jfn5v\" (UID: \"0fff7433-4b4b-457c-a985-3cd636960250\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098888 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77q4t\" (UniqueName: \"kubernetes.io/projected/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-kube-api-access-77q4t\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098913 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p42rn\" (UniqueName: \"kubernetes.io/projected/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-kube-api-access-p42rn\") pod \"marketplace-operator-79b997595-rbltz\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098938 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnfms\" (UniqueName: 
\"kubernetes.io/projected/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-kube-api-access-tnfms\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.098960 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826d5f3c-f7f7-404e-b5c5-6a33cde74892-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8vtxn\" (UID: \"826d5f3c-f7f7-404e-b5c5-6a33cde74892\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099000 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzt2l\" (UniqueName: \"kubernetes.io/projected/91baa02e-ad6d-425e-9be1-03972ce6d4ec-kube-api-access-gzt2l\") pod \"kube-storage-version-migrator-operator-b67b599dd-v6zbd\" (UID: \"91baa02e-ad6d-425e-9be1-03972ce6d4ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099022 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099063 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91baa02e-ad6d-425e-9be1-03972ce6d4ec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v6zbd\" (UID: \"91baa02e-ad6d-425e-9be1-03972ce6d4ec\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099103 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn6ml\" (UniqueName: \"kubernetes.io/projected/2f1294ca-e6aa-4294-826d-3d37b345aea2-kube-api-access-zn6ml\") pod \"catalog-operator-68c6474976-kmlz4\" (UID: \"2f1294ca-e6aa-4294-826d-3d37b345aea2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099125 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7-metrics-tls\") pod \"dns-default-8z6vx\" (UID: \"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7\") " pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099146 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca54095-1004-4c8c-a8bd-89a2c28fae76-serving-cert\") pod \"openshift-config-operator-7777fb866f-n2q7w\" (UID: \"4ca54095-1004-4c8c-a8bd-89a2c28fae76\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099166 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/826d5f3c-f7f7-404e-b5c5-6a33cde74892-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8vtxn\" (UID: \"826d5f3c-f7f7-404e-b5c5-6a33cde74892\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099222 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-serving-cert\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099244 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-trusted-ca-bundle\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099280 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-tls\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099305 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f89e6f0d-a78a-4543-9b03-ad1245748d9a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099340 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdwt\" (UniqueName: \"kubernetes.io/projected/30365124-15de-458c-b8f8-97b0fab41da4-kube-api-access-hqdwt\") pod \"auto-csr-approver-29552524-98vcv\" (UID: \"30365124-15de-458c-b8f8-97b0fab41da4\") " pod="openshift-infra/auto-csr-approver-29552524-98vcv" Mar 
10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099364 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rbltz\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099431 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fff7433-4b4b-457c-a985-3cd636960250-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jfn5v\" (UID: \"0fff7433-4b4b-457c-a985-3cd636960250\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099486 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rbltz\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.099512 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg2v7\" (UniqueName: \"kubernetes.io/projected/f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7-kube-api-access-rg2v7\") pod \"dns-default-8z6vx\" (UID: \"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7\") " pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.108982 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/91baa02e-ad6d-425e-9be1-03972ce6d4ec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v6zbd\" (UID: \"91baa02e-ad6d-425e-9be1-03972ce6d4ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.111635 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91baa02e-ad6d-425e-9be1-03972ce6d4ec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v6zbd\" (UID: \"91baa02e-ad6d-425e-9be1-03972ce6d4ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.113052 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/596d1d14-4266-4409-87fb-0a155d8e69a4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bsj4b\" (UID: \"596d1d14-4266-4409-87fb-0a155d8e69a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.115937 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f89e6f0d-a78a-4543-9b03-ad1245748d9a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.116307 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s5jkn"] Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.117494 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/a7ecb36c-d108-4771-bdc6-0d3489756695-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gh5gn\" (UID: \"a7ecb36c-d108-4771-bdc6-0d3489756695\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.119492 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7ecb36c-d108-4771-bdc6-0d3489756695-proxy-tls\") pod \"machine-config-controller-84d6567774-gh5gn\" (UID: \"a7ecb36c-d108-4771-bdc6-0d3489756695\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.127757 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-tls\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.176399 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pj9t\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-kube-api-access-7pj9t\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.178696 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5lzt\" (UniqueName: \"kubernetes.io/projected/a7ecb36c-d108-4771-bdc6-0d3489756695-kube-api-access-p5lzt\") pod \"machine-config-controller-84d6567774-gh5gn\" (UID: \"a7ecb36c-d108-4771-bdc6-0d3489756695\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 
14:05:13.201333 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rv5j\" (UniqueName: \"kubernetes.io/projected/c5745a8f-aa4e-41e6-8693-a253ffff59a8-kube-api-access-8rv5j\") pod \"machine-config-server-jjrkk\" (UID: \"c5745a8f-aa4e-41e6-8693-a253ffff59a8\") " pod="openshift-machine-config-operator/machine-config-server-jjrkk" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201570 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4nr5\" (UniqueName: \"kubernetes.io/projected/44e30f3b-b91b-42c7-9a50-35b8b08f2e25-kube-api-access-x4nr5\") pod \"olm-operator-6b444d44fb-47624\" (UID: \"44e30f3b-b91b-42c7-9a50-35b8b08f2e25\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201599 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnx7l\" (UniqueName: \"kubernetes.io/projected/c33024d8-cc44-404d-91c1-f10018a0ffe4-kube-api-access-wnx7l\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201625 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-encryption-config\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201646 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c33024d8-cc44-404d-91c1-f10018a0ffe4-tmpfs\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201667 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svhjh\" (UniqueName: \"kubernetes.io/projected/ff12c614-1348-42b6-ab73-75e3f89c20aa-kube-api-access-svhjh\") pod \"service-ca-9c57cc56f-zrr58\" (UID: \"ff12c614-1348-42b6-ab73-75e3f89c20aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201690 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnqgx\" (UniqueName: \"kubernetes.io/projected/28f8f1c7-122d-47a4-8de8-90db75c3365b-kube-api-access-fnqgx\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201714 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201751 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de4371fc-62a4-4ef6-ab10-4d742f56b8de-serving-cert\") pod \"service-ca-operator-777779d784-74p5x\" (UID: \"de4371fc-62a4-4ef6-ab10-4d742f56b8de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201772 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d20f65c-3043-4310-83f1-300d0283f9b4-config-volume\") pod \"collect-profiles-29552520-b2vs2\" 
(UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201794 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-config\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201814 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-oauth-serving-cert\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201836 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d20f65c-3043-4310-83f1-300d0283f9b4-secret-volume\") pod \"collect-profiles-29552520-b2vs2\" (UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201861 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c33024d8-cc44-404d-91c1-f10018a0ffe4-apiservice-cert\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201884 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcvlp\" (UniqueName: 
\"kubernetes.io/projected/de4371fc-62a4-4ef6-ab10-4d742f56b8de-kube-api-access-pcvlp\") pod \"service-ca-operator-777779d784-74p5x\" (UID: \"de4371fc-62a4-4ef6-ab10-4d742f56b8de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.201908 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.202709 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-bound-sa-token\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203381 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbjtp\" (UniqueName: \"kubernetes.io/projected/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-kube-api-access-dbjtp\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203420 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-image-import-ca\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc 
kubenswrapper[4911]: I0310 14:05:13.203446 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7j7x\" (UniqueName: \"kubernetes.io/projected/0d20f65c-3043-4310-83f1-300d0283f9b4-kube-api-access-s7j7x\") pod \"collect-profiles-29552520-b2vs2\" (UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203509 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c33024d8-cc44-404d-91c1-f10018a0ffe4-tmpfs\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203551 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c2a7e9-e34c-4dbf-81fa-945a91578cd7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-pmkwn\" (UID: \"c2c2a7e9-e34c-4dbf-81fa-945a91578cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203585 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9vwh\" (UniqueName: \"kubernetes.io/projected/4ca54095-1004-4c8c-a8bd-89a2c28fae76-kube-api-access-k9vwh\") pod \"openshift-config-operator-7777fb866f-n2q7w\" (UID: \"4ca54095-1004-4c8c-a8bd-89a2c28fae76\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203609 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-audit\") pod 
\"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203626 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-oauth-serving-cert\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203629 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-etcd-serving-ca\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203696 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-config\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203704 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2f1294ca-e6aa-4294-826d-3d37b345aea2-srv-cert\") pod \"catalog-operator-68c6474976-kmlz4\" (UID: \"2f1294ca-e6aa-4294-826d-3d37b345aea2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.203772 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d20f65c-3043-4310-83f1-300d0283f9b4-config-volume\") pod \"collect-profiles-29552520-b2vs2\" 
(UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.204874 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.205593 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-image-import-ca\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.205947 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-registration-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206412 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a081d982-3537-4193-9dcc-fa423bdd0fbc-cert\") pod \"ingress-canary-5hb5z\" (UID: \"a081d982-3537-4193-9dcc-fa423bdd0fbc\") " pod="openshift-ingress-canary/ingress-canary-5hb5z" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206441 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-csi-data-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: 
\"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206463 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/826d5f3c-f7f7-404e-b5c5-6a33cde74892-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8vtxn\" (UID: \"826d5f3c-f7f7-404e-b5c5-6a33cde74892\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206484 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4ca54095-1004-4c8c-a8bd-89a2c28fae76-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n2q7w\" (UID: \"4ca54095-1004-4c8c-a8bd-89a2c28fae76\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206502 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-mountpoint-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206523 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ff12c614-1348-42b6-ab73-75e3f89c20aa-signing-cabundle\") pod \"service-ca-9c57cc56f-zrr58\" (UID: \"ff12c614-1348-42b6-ab73-75e3f89c20aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206542 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0fff7433-4b4b-457c-a985-3cd636960250-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jfn5v\" (UID: \"0fff7433-4b4b-457c-a985-3cd636960250\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206560 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-config\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206579 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8856x\" (UniqueName: \"kubernetes.io/projected/a081d982-3537-4193-9dcc-fa423bdd0fbc-kube-api-access-8856x\") pod \"ingress-canary-5hb5z\" (UID: \"a081d982-3537-4193-9dcc-fa423bdd0fbc\") " pod="openshift-ingress-canary/ingress-canary-5hb5z" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206596 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206631 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/44e30f3b-b91b-42c7-9a50-35b8b08f2e25-profile-collector-cert\") pod \"olm-operator-6b444d44fb-47624\" (UID: \"44e30f3b-b91b-42c7-9a50-35b8b08f2e25\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206651 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-node-pullsecrets\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206667 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ff12c614-1348-42b6-ab73-75e3f89c20aa-signing-key\") pod \"service-ca-9c57cc56f-zrr58\" (UID: \"ff12c614-1348-42b6-ab73-75e3f89c20aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206683 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-plugins-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206704 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c33024d8-cc44-404d-91c1-f10018a0ffe4-webhook-cert\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206742 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fff7433-4b4b-457c-a985-3cd636960250-config\") pod \"kube-apiserver-operator-766d6c64bb-jfn5v\" (UID: \"0fff7433-4b4b-457c-a985-3cd636960250\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206765 
4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77q4t\" (UniqueName: \"kubernetes.io/projected/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-kube-api-access-77q4t\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206784 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p42rn\" (UniqueName: \"kubernetes.io/projected/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-kube-api-access-p42rn\") pod \"marketplace-operator-79b997595-rbltz\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206803 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnfms\" (UniqueName: \"kubernetes.io/projected/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-kube-api-access-tnfms\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206830 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4ca54095-1004-4c8c-a8bd-89a2c28fae76-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n2q7w\" (UID: \"4ca54095-1004-4c8c-a8bd-89a2c28fae76\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206832 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206882 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826d5f3c-f7f7-404e-b5c5-6a33cde74892-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8vtxn\" (UID: \"826d5f3c-f7f7-404e-b5c5-6a33cde74892\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206910 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn6ml\" (UniqueName: \"kubernetes.io/projected/2f1294ca-e6aa-4294-826d-3d37b345aea2-kube-api-access-zn6ml\") pod \"catalog-operator-68c6474976-kmlz4\" (UID: \"2f1294ca-e6aa-4294-826d-3d37b345aea2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206933 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7-metrics-tls\") pod \"dns-default-8z6vx\" (UID: \"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7\") " pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206944 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-registration-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.207542 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826d5f3c-f7f7-404e-b5c5-6a33cde74892-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-8vtxn\" (UID: \"826d5f3c-f7f7-404e-b5c5-6a33cde74892\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.207808 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-audit\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.207862 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.207893 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-etcd-serving-ca\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.207905 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-mountpoint-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.208526 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/ff12c614-1348-42b6-ab73-75e3f89c20aa-signing-cabundle\") pod \"service-ca-9c57cc56f-zrr58\" (UID: \"ff12c614-1348-42b6-ab73-75e3f89c20aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.208760 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtlnj\" (UniqueName: \"kubernetes.io/projected/596d1d14-4266-4409-87fb-0a155d8e69a4-kube-api-access-mtlnj\") pod \"cluster-samples-operator-665b6dd947-bsj4b\" (UID: \"596d1d14-4266-4409-87fb-0a155d8e69a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.208791 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4plxk"] Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209030 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-encryption-config\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206955 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca54095-1004-4c8c-a8bd-89a2c28fae76-serving-cert\") pod \"openshift-config-operator-7777fb866f-n2q7w\" (UID: \"4ca54095-1004-4c8c-a8bd-89a2c28fae76\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209114 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/826d5f3c-f7f7-404e-b5c5-6a33cde74892-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8vtxn\" (UID: 
\"826d5f3c-f7f7-404e-b5c5-6a33cde74892\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209145 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-serving-cert\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209169 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-trusted-ca-bundle\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209210 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdwt\" (UniqueName: \"kubernetes.io/projected/30365124-15de-458c-b8f8-97b0fab41da4-kube-api-access-hqdwt\") pod \"auto-csr-approver-29552524-98vcv\" (UID: \"30365124-15de-458c-b8f8-97b0fab41da4\") " pod="openshift-infra/auto-csr-approver-29552524-98vcv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209239 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rbltz\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209272 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0fff7433-4b4b-457c-a985-3cd636960250-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jfn5v\" (UID: \"0fff7433-4b4b-457c-a985-3cd636960250\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209315 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rbltz\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209342 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg2v7\" (UniqueName: \"kubernetes.io/projected/f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7-kube-api-access-rg2v7\") pod \"dns-default-8z6vx\" (UID: \"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7\") " pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209372 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209395 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-socket-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209415 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-serving-cert\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209445 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbxtb\" (UniqueName: \"kubernetes.io/projected/c2c2a7e9-e34c-4dbf-81fa-945a91578cd7-kube-api-access-fbxtb\") pod \"package-server-manager-789f6589d5-pmkwn\" (UID: \"c2c2a7e9-e34c-4dbf-81fa-945a91578cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209473 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4371fc-62a4-4ef6-ab10-4d742f56b8de-config\") pod \"service-ca-operator-777779d784-74p5x\" (UID: \"de4371fc-62a4-4ef6-ab10-4d742f56b8de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209497 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c5745a8f-aa4e-41e6-8693-a253ffff59a8-node-bootstrap-token\") pod \"machine-config-server-jjrkk\" (UID: \"c5745a8f-aa4e-41e6-8693-a253ffff59a8\") " pod="openshift-machine-config-operator/machine-config-server-jjrkk" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209523 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2f1294ca-e6aa-4294-826d-3d37b345aea2-profile-collector-cert\") pod \"catalog-operator-68c6474976-kmlz4\" (UID: \"2f1294ca-e6aa-4294-826d-3d37b345aea2\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209545 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-oauth-config\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209569 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-audit-dir\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209595 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/44e30f3b-b91b-42c7-9a50-35b8b08f2e25-srv-cert\") pod \"olm-operator-6b444d44fb-47624\" (UID: \"44e30f3b-b91b-42c7-9a50-35b8b08f2e25\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209626 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-etcd-client\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209649 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-service-ca\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " 
pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.209694 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9fl4\" (UniqueName: \"kubernetes.io/projected/1ad00bf3-d146-4d5d-806e-fb340e3762bf-kube-api-access-d9fl4\") pod \"downloads-7954f5f757-nfhwf\" (UID: \"1ad00bf3-d146-4d5d-806e-fb340e3762bf\") " pod="openshift-console/downloads-7954f5f757-nfhwf" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.206537 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-csi-data-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.210488 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-socket-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.210926 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c33024d8-cc44-404d-91c1-f10018a0ffe4-apiservice-cert\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.210826 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7-config-volume\") pod \"dns-default-8z6vx\" (UID: \"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7\") " pod="openshift-dns/dns-default-8z6vx" Mar 10 
14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.211251 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d20f65c-3043-4310-83f1-300d0283f9b4-secret-volume\") pod \"collect-profiles-29552520-b2vs2\" (UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.211480 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rbltz\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:05:13 crc kubenswrapper[4911]: E0310 14:05:13.211776 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:13.711752215 +0000 UTC m=+218.275272302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.214542 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c5745a8f-aa4e-41e6-8693-a253ffff59a8-certs\") pod \"machine-config-server-jjrkk\" (UID: \"c5745a8f-aa4e-41e6-8693-a253ffff59a8\") " pod="openshift-machine-config-operator/machine-config-server-jjrkk" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.212040 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fff7433-4b4b-457c-a985-3cd636960250-config\") pod \"kube-apiserver-operator-766d6c64bb-jfn5v\" (UID: \"0fff7433-4b4b-457c-a985-3cd636960250\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.212072 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-plugins-dir\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.212922 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4371fc-62a4-4ef6-ab10-4d742f56b8de-config\") pod \"service-ca-operator-777779d784-74p5x\" (UID: \"de4371fc-62a4-4ef6-ab10-4d742f56b8de\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.213125 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2f1294ca-e6aa-4294-826d-3d37b345aea2-srv-cert\") pod \"catalog-operator-68c6474976-kmlz4\" (UID: \"2f1294ca-e6aa-4294-826d-3d37b345aea2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.213304 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-audit-dir\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.213486 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-service-ca\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.213827 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7-config-volume\") pod \"dns-default-8z6vx\" (UID: \"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7\") " pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.214270 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-trusted-ca-bundle\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc 
kubenswrapper[4911]: I0310 14:05:13.214412 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-node-pullsecrets\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.211966 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-config\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.217009 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de4371fc-62a4-4ef6-ab10-4d742f56b8de-serving-cert\") pod \"service-ca-operator-777779d784-74p5x\" (UID: \"de4371fc-62a4-4ef6-ab10-4d742f56b8de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.217531 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.217930 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a081d982-3537-4193-9dcc-fa423bdd0fbc-cert\") pod \"ingress-canary-5hb5z\" (UID: \"a081d982-3537-4193-9dcc-fa423bdd0fbc\") " pod="openshift-ingress-canary/ingress-canary-5hb5z" Mar 10 14:05:13 crc 
kubenswrapper[4911]: I0310 14:05:13.218133 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rbltz\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.218663 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/826d5f3c-f7f7-404e-b5c5-6a33cde74892-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8vtxn\" (UID: \"826d5f3c-f7f7-404e-b5c5-6a33cde74892\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.219294 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/44e30f3b-b91b-42c7-9a50-35b8b08f2e25-srv-cert\") pod \"olm-operator-6b444d44fb-47624\" (UID: \"44e30f3b-b91b-42c7-9a50-35b8b08f2e25\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.219676 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-oauth-config\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.220099 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c5745a8f-aa4e-41e6-8693-a253ffff59a8-certs\") pod \"machine-config-server-jjrkk\" (UID: \"c5745a8f-aa4e-41e6-8693-a253ffff59a8\") " 
pod="openshift-machine-config-operator/machine-config-server-jjrkk" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.220468 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-etcd-client\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.220538 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/44e30f3b-b91b-42c7-9a50-35b8b08f2e25-profile-collector-cert\") pod \"olm-operator-6b444d44fb-47624\" (UID: \"44e30f3b-b91b-42c7-9a50-35b8b08f2e25\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.221641 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c2a7e9-e34c-4dbf-81fa-945a91578cd7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-pmkwn\" (UID: \"c2c2a7e9-e34c-4dbf-81fa-945a91578cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.222194 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7-metrics-tls\") pod \"dns-default-8z6vx\" (UID: \"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7\") " pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.222665 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca54095-1004-4c8c-a8bd-89a2c28fae76-serving-cert\") pod \"openshift-config-operator-7777fb866f-n2q7w\" (UID: 
\"4ca54095-1004-4c8c-a8bd-89a2c28fae76\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.222997 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fff7433-4b4b-457c-a985-3cd636960250-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jfn5v\" (UID: \"0fff7433-4b4b-457c-a985-3cd636960250\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.223606 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ff12c614-1348-42b6-ab73-75e3f89c20aa-signing-key\") pod \"service-ca-9c57cc56f-zrr58\" (UID: \"ff12c614-1348-42b6-ab73-75e3f89c20aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.224504 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2f1294ca-e6aa-4294-826d-3d37b345aea2-profile-collector-cert\") pod \"catalog-operator-68c6474976-kmlz4\" (UID: \"2f1294ca-e6aa-4294-826d-3d37b345aea2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.224831 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-serving-cert\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.225386 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c5745a8f-aa4e-41e6-8693-a253ffff59a8-node-bootstrap-token\") 
pod \"machine-config-server-jjrkk\" (UID: \"c5745a8f-aa4e-41e6-8693-a253ffff59a8\") " pod="openshift-machine-config-operator/machine-config-server-jjrkk" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.226153 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzt2l\" (UniqueName: \"kubernetes.io/projected/91baa02e-ad6d-425e-9be1-03972ce6d4ec-kube-api-access-gzt2l\") pod \"kube-storage-version-migrator-operator-b67b599dd-v6zbd\" (UID: \"91baa02e-ad6d-425e-9be1-03972ce6d4ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.232286 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-serving-cert\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.240181 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c33024d8-cc44-404d-91c1-f10018a0ffe4-webhook-cert\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.249031 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rv5j\" (UniqueName: \"kubernetes.io/projected/c5745a8f-aa4e-41e6-8693-a253ffff59a8-kube-api-access-8rv5j\") pod \"machine-config-server-jjrkk\" (UID: \"c5745a8f-aa4e-41e6-8693-a253ffff59a8\") " pod="openshift-machine-config-operator/machine-config-server-jjrkk" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.271194 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jjrkk" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.274297 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcvlp\" (UniqueName: \"kubernetes.io/projected/de4371fc-62a4-4ef6-ab10-4d742f56b8de-kube-api-access-pcvlp\") pod \"service-ca-operator-777779d784-74p5x\" (UID: \"de4371fc-62a4-4ef6-ab10-4d742f56b8de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x" Mar 10 14:05:13 crc kubenswrapper[4911]: W0310 14:05:13.284576 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd0502d_ca6f_45ff_8a4c_bfff96edaf04.slice/crio-cd5b6265a6545e7ed120a79bddfc3c0cd30bb25608b568a6b87d1c6bc6971097 WatchSource:0}: Error finding container cd5b6265a6545e7ed120a79bddfc3c0cd30bb25608b568a6b87d1c6bc6971097: Status 404 returned error can't find the container with id cd5b6265a6545e7ed120a79bddfc3c0cd30bb25608b568a6b87d1c6bc6971097 Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.284880 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.302504 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4nr5\" (UniqueName: \"kubernetes.io/projected/44e30f3b-b91b-42c7-9a50-35b8b08f2e25-kube-api-access-x4nr5\") pod \"olm-operator-6b444d44fb-47624\" (UID: \"44e30f3b-b91b-42c7-9a50-35b8b08f2e25\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.316045 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:13 crc kubenswrapper[4911]: E0310 14:05:13.316647 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:13.81661442 +0000 UTC m=+218.380134337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.318579 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnx7l\" (UniqueName: \"kubernetes.io/projected/c33024d8-cc44-404d-91c1-f10018a0ffe4-kube-api-access-wnx7l\") pod \"packageserver-d55dfcdfc-c4sbh\" (UID: \"c33024d8-cc44-404d-91c1-f10018a0ffe4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.327005 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.340579 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnqgx\" (UniqueName: \"kubernetes.io/projected/28f8f1c7-122d-47a4-8de8-90db75c3365b-kube-api-access-fnqgx\") pod \"console-f9d7485db-lpjl7\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.353905 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svhjh\" (UniqueName: \"kubernetes.io/projected/ff12c614-1348-42b6-ab73-75e3f89c20aa-kube-api-access-svhjh\") pod \"service-ca-9c57cc56f-zrr58\" (UID: \"ff12c614-1348-42b6-ab73-75e3f89c20aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.373505 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k9vwh\" (UniqueName: \"kubernetes.io/projected/4ca54095-1004-4c8c-a8bd-89a2c28fae76-kube-api-access-k9vwh\") pod \"openshift-config-operator-7777fb866f-n2q7w\" (UID: \"4ca54095-1004-4c8c-a8bd-89a2c28fae76\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.384335 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.402769 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7j7x\" (UniqueName: \"kubernetes.io/projected/0d20f65c-3043-4310-83f1-300d0283f9b4-kube-api-access-s7j7x\") pod \"collect-profiles-29552520-b2vs2\" (UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.417066 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk" event={"ID":"20ce004d-fd3b-4770-8ba5-39f80a0b1c8a","Type":"ContainerStarted","Data":"bf0af60ea306237df06e055c517280d064f912aa67b8e7a1c59808d1911c538d"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.417123 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk" event={"ID":"20ce004d-fd3b-4770-8ba5-39f80a0b1c8a","Type":"ContainerStarted","Data":"5c08c20960e4ba87f415667daafa149384c7556d48e1a15e6aa5f4d7da70739c"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.418604 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: 
\"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:13 crc kubenswrapper[4911]: E0310 14:05:13.420511 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:13.920497182 +0000 UTC m=+218.484017099 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.422784 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw"] Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.424988 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jjrkk" event={"ID":"c5745a8f-aa4e-41e6-8693-a253ffff59a8","Type":"ContainerStarted","Data":"776dbfa411b6596b4c1e3c0b66ef00944a4c0d0b3b9acb823938afbd814d9161"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.426405 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4plxk" event={"ID":"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04","Type":"ContainerStarted","Data":"cd5b6265a6545e7ed120a79bddfc3c0cd30bb25608b568a6b87d1c6bc6971097"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.427626 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbjtp\" (UniqueName: 
\"kubernetes.io/projected/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-kube-api-access-dbjtp\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.429185 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" event={"ID":"39208142-b788-4b42-a0f2-421544f8833f","Type":"ContainerStarted","Data":"85eff2b3b0244b9c3a5ec0df62e81d4e4ecfe9a2ed0ca1cb1f78c30b80451352"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.432210 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" event={"ID":"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d","Type":"ContainerStarted","Data":"958f6b47ebb0ea654802512722c2af73cfbcfabdcf539e93b1f46d16ca3febc9"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.432237 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" event={"ID":"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d","Type":"ContainerStarted","Data":"dab234ec58546b342e8a901a5976470077920993c44ace14f24f5eb6ffd2f771"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.433782 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.433879 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/826d5f3c-f7f7-404e-b5c5-6a33cde74892-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8vtxn\" (UID: \"826d5f3c-f7f7-404e-b5c5-6a33cde74892\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 
14:05:13.435083 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5c4nw" event={"ID":"05d84958-d228-406a-9337-389f9a5f286d","Type":"ContainerStarted","Data":"bf85030e55af7c4d475c47357bb7e3ea7cc66f0898080cd1dc60710a46755f0a"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.435133 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5c4nw" event={"ID":"05d84958-d228-406a-9337-389f9a5f286d","Type":"ContainerStarted","Data":"287c73df68879d76801c51112c2a210729417e8be26788ab3d2633cc639ec2ae"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.437298 4911 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pqkdl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.437361 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" podUID="d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.439565 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" event={"ID":"e9294c6c-c62f-44eb-a4c9-a09523a5965b","Type":"ContainerStarted","Data":"1f9dc49513d424ab191133737546d8743dfc3b93be86273819d858e36c8c301d"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.439631 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" 
event={"ID":"e9294c6c-c62f-44eb-a4c9-a09523a5965b","Type":"ContainerStarted","Data":"f40278b9f049fd3bfebdad14ad01e5d3c1ef341805a8d2afba2d882bf222d2f4"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.439644 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" event={"ID":"e9294c6c-c62f-44eb-a4c9-a09523a5965b","Type":"ContainerStarted","Data":"1b1cecb74f00f2b2d81fe76d68c757ed3f0abed80732eb15131373d7af9cd94f"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.441742 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" event={"ID":"92d9f774-c786-4afb-9ffd-57f2b9e0064e","Type":"ContainerStarted","Data":"d012305c9909343497315155d109bb885dc076c1e25ad119fc0cd7a04729cf14"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.441799 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" event={"ID":"92d9f774-c786-4afb-9ffd-57f2b9e0064e","Type":"ContainerStarted","Data":"9cac21863a89c1694da0c19bad2752e0337837dff410e89a56bc37dbdf546287"} Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.452713 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.455120 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn6ml\" (UniqueName: \"kubernetes.io/projected/2f1294ca-e6aa-4294-826d-3d37b345aea2-kube-api-access-zn6ml\") pod \"catalog-operator-68c6474976-kmlz4\" (UID: \"2f1294ca-e6aa-4294-826d-3d37b345aea2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.460178 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7"] Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.461407 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.474092 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.474492 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.483443 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.491470 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fff7433-4b4b-457c-a985-3cd636960250-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jfn5v\" (UID: \"0fff7433-4b4b-457c-a985-3cd636960250\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.495204 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/203f9b7d-fef8-4d09-96b9-8cfaa2e75901-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nl7jj\" (UID: \"203f9b7d-fef8-4d09-96b9-8cfaa2e75901\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.503431 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.523516 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:13 crc kubenswrapper[4911]: E0310 14:05:13.525914 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.025873449 +0000 UTC m=+218.589393366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.528787 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.532995 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" Mar 10 14:05:13 crc kubenswrapper[4911]: E0310 14:05:13.533023 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.032995941 +0000 UTC m=+218.596515858 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.535410 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbxtb\" (UniqueName: \"kubernetes.io/projected/c2c2a7e9-e34c-4dbf-81fa-945a91578cd7-kube-api-access-fbxtb\") pod \"package-server-manager-789f6589d5-pmkwn\" (UID: \"c2c2a7e9-e34c-4dbf-81fa-945a91578cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.536160 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg2v7\" (UniqueName: \"kubernetes.io/projected/f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7-kube-api-access-rg2v7\") pod \"dns-default-8z6vx\" (UID: \"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7\") " pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.558271 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.559630 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8856x\" (UniqueName: \"kubernetes.io/projected/a081d982-3537-4193-9dcc-fa423bdd0fbc-kube-api-access-8856x\") pod \"ingress-canary-5hb5z\" (UID: \"a081d982-3537-4193-9dcc-fa423bdd0fbc\") " pod="openshift-ingress-canary/ingress-canary-5hb5z" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.566514 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5hb5z" Mar 10 14:05:13 crc kubenswrapper[4911]: W0310 14:05:13.571371 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa0f80b6_3d76_4b00_a67c_e5ace1795e58.slice/crio-4d18930a64d983b4d5635531a0a34420234f109b3f98f93e977bf616f94bdd1b WatchSource:0}: Error finding container 4d18930a64d983b4d5635531a0a34420234f109b3f98f93e977bf616f94bdd1b: Status 404 returned error can't find the container with id 4d18930a64d983b4d5635531a0a34420234f109b3f98f93e977bf616f94bdd1b Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.599541 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p42rn\" (UniqueName: \"kubernetes.io/projected/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-kube-api-access-p42rn\") pod \"marketplace-operator-79b997595-rbltz\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.600218 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdwt\" (UniqueName: \"kubernetes.io/projected/30365124-15de-458c-b8f8-97b0fab41da4-kube-api-access-hqdwt\") pod \"auto-csr-approver-29552524-98vcv\" (UID: \"30365124-15de-458c-b8f8-97b0fab41da4\") " pod="openshift-infra/auto-csr-approver-29552524-98vcv" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.628618 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77q4t\" (UniqueName: \"kubernetes.io/projected/ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37-kube-api-access-77q4t\") pod \"csi-hostpathplugin-rmmf6\" (UID: \"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37\") " pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.631088 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.632840 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnfms\" (UniqueName: \"kubernetes.io/projected/3ed3ce1b-b217-44d0-8f24-57c9eb678ea7-kube-api-access-tnfms\") pod \"apiserver-76f77b778f-lcvpv\" (UID: \"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7\") " pod="openshift-apiserver/apiserver-76f77b778f-lcvpv"
Mar 10 14:05:13 crc kubenswrapper[4911]: E0310 14:05:13.633960 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.133940912 +0000 UTC m=+218.697460839 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.639682 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7tfvh"]
Mar 10 14:05:13 crc kubenswrapper[4911]: W0310 14:05:13.649275 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c79f6f9_05f1_44eb_8565_f0d888ee5163.slice/crio-ea2f2851491430bfd133a1ef751163b21334f7502ad86caa4b96dc2ecab1492c WatchSource:0}: Error finding container ea2f2851491430bfd133a1ef751163b21334f7502ad86caa4b96dc2ecab1492c: Status 404 returned error can't find the container with id ea2f2851491430bfd133a1ef751163b21334f7502ad86caa4b96dc2ecab1492c
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.656377 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9fl4\" (UniqueName: \"kubernetes.io/projected/1ad00bf3-d146-4d5d-806e-fb340e3762bf-kube-api-access-d9fl4\") pod \"downloads-7954f5f757-nfhwf\" (UID: \"1ad00bf3-d146-4d5d-806e-fb340e3762bf\") " pod="openshift-console/downloads-7954f5f757-nfhwf"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.704858 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nfhwf"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.712656 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.718419 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v"
Mar 10 14:05:13 crc kubenswrapper[4911]: W0310 14:05:13.725013 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2160b26f_7876_42d1_8d78_22f6f57cb08e.slice/crio-6807294a814c105ef7219b233c1cc643734827f9fd24d91dfce76bae3a18c60e WatchSource:0}: Error finding container 6807294a814c105ef7219b233c1cc643734827f9fd24d91dfce76bae3a18c60e: Status 404 returned error can't find the container with id 6807294a814c105ef7219b233c1cc643734827f9fd24d91dfce76bae3a18c60e
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.726230 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.732612 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.732983 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn"
Mar 10 14:05:13 crc kubenswrapper[4911]: E0310 14:05:13.733013 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.232999486 +0000 UTC m=+218.796519403 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.737956 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.746416 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.793565 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.797159 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5c4nw"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.805431 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 14:05:13 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld
Mar 10 14:05:13 crc kubenswrapper[4911]: [+]process-running ok
Mar 10 14:05:13 crc kubenswrapper[4911]: healthz check failed
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.805487 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.826297 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552524-98vcv"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.835129 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:13 crc kubenswrapper[4911]: E0310 14:05:13.836312 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.336286844 +0000 UTC m=+218.899806761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.845485 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rmmf6"
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.868950 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj"]
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.869045 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd"]
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.887129 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-297k4"]
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.891769 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-f4tz6"]
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.936880 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:13 crc kubenswrapper[4911]: E0310 14:05:13.937199 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.437186403 +0000 UTC m=+219.000706310 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.941407 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk"]
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.970105 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn"]
Mar 10 14:05:13 crc kubenswrapper[4911]: I0310 14:05:13.995201 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.017943 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.039353 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.039708 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.539673672 +0000 UTC m=+219.103193589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.039975 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.043699 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.543664328 +0000 UTC m=+219.107184255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.121703 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.123541 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.142803 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.142974 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.642941218 +0000 UTC m=+219.206461135 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.143094 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.143472 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.643458851 +0000 UTC m=+219.206978768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.193147 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-82dx2" podStartSLOduration=183.193112647 podStartE2EDuration="3m3.193112647s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:14.158019042 +0000 UTC m=+218.721538959" watchObservedRunningTime="2026-03-10 14:05:14.193112647 +0000 UTC m=+218.756632584"
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.244330 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.245157 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zrr58"]
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.245264 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.745235022 +0000 UTC m=+219.308755119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.345320 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.346919 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.347397 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.847377962 +0000 UTC m=+219.410897879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.353005 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.368321 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" podStartSLOduration=183.368296355 podStartE2EDuration="3m3.368296355s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:14.366107823 +0000 UTC m=+218.929627740" watchObservedRunningTime="2026-03-10 14:05:14.368296355 +0000 UTC m=+218.931816272"
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.373683 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w"]
Mar 10 14:05:14 crc kubenswrapper[4911]: W0310 14:05:14.383244 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91baa02e_ad6d_425e_9be1_03972ce6d4ec.slice/crio-76a33d0f9f61ae6a63f76f7a53d6489516d4f054b1942e08314748fec73b7338 WatchSource:0}: Error finding container 76a33d0f9f61ae6a63f76f7a53d6489516d4f054b1942e08314748fec73b7338: Status 404 returned error can't find the container with id 76a33d0f9f61ae6a63f76f7a53d6489516d4f054b1942e08314748fec73b7338
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.402347 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5c4nw" podStartSLOduration=183.402324305 podStartE2EDuration="3m3.402324305s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:14.400909931 +0000 UTC m=+218.964429838" watchObservedRunningTime="2026-03-10 14:05:14.402324305 +0000 UTC m=+218.965844222"
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.435309 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" podStartSLOduration=183.435285849 podStartE2EDuration="3m3.435285849s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:14.434408288 +0000 UTC m=+218.997928205" watchObservedRunningTime="2026-03-10 14:05:14.435285849 +0000 UTC m=+218.998805766"
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.448008 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.448442 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:14.948422115 +0000 UTC m=+219.511942032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: W0310 14:05:14.450782 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d20f65c_3043_4310_83f1_300d0283f9b4.slice/crio-736e5f7f64cbefe8f9391a8d0fa29f06494536b0b6af941be546ef62301d9d9c WatchSource:0}: Error finding container 736e5f7f64cbefe8f9391a8d0fa29f06494536b0b6af941be546ef62301d9d9c: Status 404 returned error can't find the container with id 736e5f7f64cbefe8f9391a8d0fa29f06494536b0b6af941be546ef62301d9d9c
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.505759 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lpjl7"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.552183 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.554044 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.054026988 +0000 UTC m=+219.617546905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.555811 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5hb5z"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.584174 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-74p5x"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.610178 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" event={"ID":"92d9f774-c786-4afb-9ffd-57f2b9e0064e","Type":"ContainerStarted","Data":"c6d8cfe41c45e078ead529e69cf1fa93a15688a65711b8c73d0ba0371260ad68"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.620555 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk" event={"ID":"20ce004d-fd3b-4770-8ba5-39f80a0b1c8a","Type":"ContainerStarted","Data":"442f00bf773e3181d9c770dc4e5505d4972f035c19403f5e30cc769ab8ab7401"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.632545 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rrzzv" podStartSLOduration=183.632528029 podStartE2EDuration="3m3.632528029s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:14.631892053 +0000 UTC m=+219.195411960" watchObservedRunningTime="2026-03-10 14:05:14.632528029 +0000 UTC m=+219.196047946"
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.654791 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.655167 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.155126403 +0000 UTC m=+219.718646320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.655360 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.656853 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.156842204 +0000 UTC m=+219.720362321 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.664466 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4" event={"ID":"c671ff01-6a3e-4626-8e8a-c78feb4b3491","Type":"ContainerStarted","Data":"0f25c8f7b40a0074bf60abd8d2dc333fb5f98e9051d5e2e04cfd3576a962ffd2"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.678370 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" event={"ID":"ff12c614-1348-42b6-ab73-75e3f89c20aa","Type":"ContainerStarted","Data":"763672bb409a6263fc0ad4fc3d2f9e4e630d5f38cd0961a3ab7d205d42244d28"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.740581 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" event={"ID":"aa0f80b6-3d76-4b00-a67c-e5ace1795e58","Type":"ContainerStarted","Data":"4d18930a64d983b4d5635531a0a34420234f109b3f98f93e977bf616f94bdd1b"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.758251 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" event={"ID":"39208142-b788-4b42-a0f2-421544f8833f","Type":"ContainerStarted","Data":"e6cec76f700736392f0fe21f4cc070a4e5e3fe072394df286576089bb2ed8c45"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.760582 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.762705 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.262672063 +0000 UTC m=+219.826191980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.771556 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.772397 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.272378397 +0000 UTC m=+219.835898314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.780336 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj" event={"ID":"61dbbc3f-94f4-4c65-8c0b-7181159fcae3","Type":"ContainerStarted","Data":"b4f4c13bb5381eb9e6c92d2ce3951cdadd14a480cbc81fa04e8de000ca010442"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.793000 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" event={"ID":"2c79f6f9-05f1-44eb-8565-f0d888ee5163","Type":"ContainerStarted","Data":"ea2f2851491430bfd133a1ef751163b21334f7502ad86caa4b96dc2ecab1492c"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.801444 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 14:05:14 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld
Mar 10 14:05:14 crc kubenswrapper[4911]: [+]process-running ok
Mar 10 14:05:14 crc kubenswrapper[4911]: healthz check failed
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.802197 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.803611 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" event={"ID":"c33024d8-cc44-404d-91c1-f10018a0ffe4","Type":"ContainerStarted","Data":"21a232358384e2fbe9bc477fbc37344c3f12987e284c2aa3ab05c59e7a41ca96"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.813467 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" event={"ID":"2160b26f-7876-42d1-8d78-22f6f57cb08e","Type":"ContainerStarted","Data":"6807294a814c105ef7219b233c1cc643734827f9fd24d91dfce76bae3a18c60e"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.821058 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" event={"ID":"6e3c1bcf-b096-4526-8b7c-d304a5afa191","Type":"ContainerStarted","Data":"b0292bd88a3402d69c2001faf29a7f67f7cef07edba8527611cfb259e5e4faa9"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.823980 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.844578 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nfhwf"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.847108 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.850701 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.850777 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4plxk"
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.850794 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4plxk" event={"ID":"4dd0502d-ca6f-45ff-8a4c-bfff96edaf04","Type":"ContainerStarted","Data":"a827547a140d5df8d866cfa85faf4525c87ffcff4137dfb5352f4f5804a58414"}
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.852419 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rmmf6"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.852539 4911 patch_prober.go:28] interesting pod/console-operator-58897d9998-4plxk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.852566 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4plxk" podUID="4dd0502d-ca6f-45ff-8a4c-bfff96edaf04" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.857342 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.859826 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lcvpv"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.863845 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v"]
Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.866096 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd" event={"ID":"91baa02e-ad6d-425e-9be1-03972ce6d4ec","Type":"ContainerStarted","Data":"76a33d0f9f61ae6a63f76f7a53d6489516d4f054b1942e08314748fec73b7338"} Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.872533 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.872768 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd" event={"ID":"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0","Type":"ContainerStarted","Data":"f0bf7f0ebca7ad67047757f39966206b32014566d7981e1b1d70620c66d102ee"} Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.872825 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.372796335 +0000 UTC m=+219.936316252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.873202 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.873657 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.373649415 +0000 UTC m=+219.937169322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:14 crc kubenswrapper[4911]: W0310 14:05:14.879627 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod826d5f3c_f7f7_404e_b5c5_6a33cde74892.slice/crio-3ae25f54928a5311f48f5d1e33b01d1fc820bb9aa708a14d2e98b1f59dbf49dc WatchSource:0}: Error finding container 3ae25f54928a5311f48f5d1e33b01d1fc820bb9aa708a14d2e98b1f59dbf49dc: Status 404 returned error can't find the container with id 3ae25f54928a5311f48f5d1e33b01d1fc820bb9aa708a14d2e98b1f59dbf49dc Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.883992 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" event={"ID":"44e30f3b-b91b-42c7-9a50-35b8b08f2e25","Type":"ContainerStarted","Data":"78b6cc0cb430c128ce564e738a66ea457359e220454ec5dc274801fdaf62fa58"} Mar 10 14:05:14 crc kubenswrapper[4911]: W0310 14:05:14.886790 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203f9b7d_fef8_4d09_96b9_8cfaa2e75901.slice/crio-f5ab376c07b11f4f9ef6195ff2280391927d61764108b56448800a0924cab5fd WatchSource:0}: Error finding container f5ab376c07b11f4f9ef6195ff2280391927d61764108b56448800a0924cab5fd: Status 404 returned error can't find the container with id f5ab376c07b11f4f9ef6195ff2280391927d61764108b56448800a0924cab5fd Mar 10 14:05:14 crc kubenswrapper[4911]: W0310 14:05:14.889791 4911 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f1294ca_e6aa_4294_826d_3d37b345aea2.slice/crio-746c992df7b4d80558672dac24689fd61920e38dc462ab84d614d0a16d0d1b99 WatchSource:0}: Error finding container 746c992df7b4d80558672dac24689fd61920e38dc462ab84d614d0a16d0d1b99: Status 404 returned error can't find the container with id 746c992df7b4d80558672dac24689fd61920e38dc462ab84d614d0a16d0d1b99 Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.890071 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn" event={"ID":"a7ecb36c-d108-4771-bdc6-0d3489756695","Type":"ContainerStarted","Data":"155f610ba6f23a78a5d783124bc23fae8de7b55e2ee916d060b8d5a69323aa79"} Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.892459 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk" event={"ID":"96814f30-58d3-4501-bb95-185465b9df21","Type":"ContainerStarted","Data":"6d48fb3a44bb32a8472a5c6398fdff4aa319a5b02bc430861236317e7ee75ddf"} Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.895341 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" event={"ID":"a409f0e6-8131-4025-b258-842a71a125b6","Type":"ContainerStarted","Data":"bab0e243f805fa8cc00994bd0a79daf9b617e0dfc1876b168d4c04d1ce14d96e"} Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.898152 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jjrkk" event={"ID":"c5745a8f-aa4e-41e6-8693-a253ffff59a8","Type":"ContainerStarted","Data":"e7c5397bf1fd3d567af218a7d53b2c74b07330483048a62f77b49a9978fb762e"} Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.899649 4911 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pqkdl container/controller-manager namespace/openshift-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.899721 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" podUID="d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.906034 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bqrb8" podStartSLOduration=183.906015615 podStartE2EDuration="3m3.906015615s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:14.904056808 +0000 UTC m=+219.467576725" watchObservedRunningTime="2026-03-10 14:05:14.906015615 +0000 UTC m=+219.469535532" Mar 10 14:05:14 crc kubenswrapper[4911]: W0310 14:05:14.959785 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2c2a7e9_e34c_4dbf_81fa_945a91578cd7.slice/crio-a4713ac8816352e8436c0052367e16fbe899ca284ef657194ebcacaff946a4e8 WatchSource:0}: Error finding container a4713ac8816352e8436c0052367e16fbe899ca284ef657194ebcacaff946a4e8: Status 404 returned error can't find the container with id a4713ac8816352e8436c0052367e16fbe899ca284ef657194ebcacaff946a4e8 Mar 10 14:05:14 crc kubenswrapper[4911]: W0310 14:05:14.960171 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad0d7b23_d43a_4ee1_b3af_4bd65b7d8f37.slice/crio-d08deb3cb534ad7bb8b5e34c7f553f0965ca4675c712abe206e393d051e7b526 WatchSource:0}: Error finding container d08deb3cb534ad7bb8b5e34c7f553f0965ca4675c712abe206e393d051e7b526: Status 404 returned error can't find the container with id d08deb3cb534ad7bb8b5e34c7f553f0965ca4675c712abe206e393d051e7b526 Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.974919 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8z6vx"] Mar 10 14:05:14 crc kubenswrapper[4911]: I0310 14:05:14.978053 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:14 crc kubenswrapper[4911]: E0310 14:05:14.979899 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.479871503 +0000 UTC m=+220.043391570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.035562 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552524-98vcv"] Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.039277 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rbltz"] Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.079784 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:15 crc kubenswrapper[4911]: E0310 14:05:15.080251 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.58022967 +0000 UTC m=+220.143749577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:15 crc kubenswrapper[4911]: W0310 14:05:15.082921 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30365124_15de_458c_b8f8_97b0fab41da4.slice/crio-fc9e42fd5098d6176d2287923450c19fb71f879319c73b51ec3a78e6e695c00d WatchSource:0}: Error finding container fc9e42fd5098d6176d2287923450c19fb71f879319c73b51ec3a78e6e695c00d: Status 404 returned error can't find the container with id fc9e42fd5098d6176d2287923450c19fb71f879319c73b51ec3a78e6e695c00d Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.098342 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 14:05:15 crc kubenswrapper[4911]: W0310 14:05:15.115294 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e3bb0fc_9a7a_4008_9d8d_c84e19b65c02.slice/crio-cec8345f9f997a7569abdb07e834fd72d25d07fdc4f23fdb5ee7c20a1ed3f45a WatchSource:0}: Error finding container cec8345f9f997a7569abdb07e834fd72d25d07fdc4f23fdb5ee7c20a1ed3f45a: Status 404 returned error can't find the container with id cec8345f9f997a7569abdb07e834fd72d25d07fdc4f23fdb5ee7c20a1ed3f45a Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.181068 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:15 crc kubenswrapper[4911]: E0310 14:05:15.181964 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.68193699 +0000 UTC m=+220.245456927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.258697 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-4plxk" podStartSLOduration=184.258671468 podStartE2EDuration="3m4.258671468s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:15.258595316 +0000 UTC m=+219.822115233" watchObservedRunningTime="2026-03-10 14:05:15.258671468 +0000 UTC m=+219.822191385" Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.282648 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: 
\"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:15 crc kubenswrapper[4911]: E0310 14:05:15.283116 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.783100496 +0000 UTC m=+220.346620413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.292224 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" podStartSLOduration=184.292198175 podStartE2EDuration="3m4.292198175s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:15.289003008 +0000 UTC m=+219.852522925" watchObservedRunningTime="2026-03-10 14:05:15.292198175 +0000 UTC m=+219.855718082" Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.308205 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-jjrkk" podStartSLOduration=5.30817215 podStartE2EDuration="5.30817215s" podCreationTimestamp="2026-03-10 14:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:15.307114204 +0000 
UTC m=+219.870634121" watchObservedRunningTime="2026-03-10 14:05:15.30817215 +0000 UTC m=+219.871692067" Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.368653 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-t9tlk" podStartSLOduration=184.368621225 podStartE2EDuration="3m4.368621225s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:15.359000224 +0000 UTC m=+219.922520171" watchObservedRunningTime="2026-03-10 14:05:15.368621225 +0000 UTC m=+219.932141142" Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.384177 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:15 crc kubenswrapper[4911]: E0310 14:05:15.384739 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.884698922 +0000 UTC m=+220.448218839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.389441 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sgkhd" podStartSLOduration=184.389418046 podStartE2EDuration="3m4.389418046s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:15.388174046 +0000 UTC m=+219.951693963" watchObservedRunningTime="2026-03-10 14:05:15.389418046 +0000 UTC m=+219.952937953" Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.485628 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:15 crc kubenswrapper[4911]: E0310 14:05:15.486331 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:15.98631248 +0000 UTC m=+220.549832397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.586804 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:15 crc kubenswrapper[4911]: E0310 14:05:15.587669 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:16.087630759 +0000 UTC m=+220.651150676 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:15 crc kubenswrapper[4911]: E0310 14:05:15.690180 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 14:05:16.190160229 +0000 UTC m=+220.753680136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.689642 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.737131 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52196: no serving certificate available for the kubelet" Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.793304 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:15 crc kubenswrapper[4911]: E0310 14:05:15.793806 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:16.293774984 +0000 UTC m=+220.857294901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.807559 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 14:05:15 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld
Mar 10 14:05:15 crc kubenswrapper[4911]: [+]process-running ok
Mar 10 14:05:15 crc kubenswrapper[4911]: healthz check failed
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.807639 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.823804 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52198: no serving certificate available for the kubelet"
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.904983 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:15 crc kubenswrapper[4911]: E0310 14:05:15.905529 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:16.405500654 +0000 UTC m=+220.969020751 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.913276 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52204: no serving certificate available for the kubelet"
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.919686 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" event={"ID":"203f9b7d-fef8-4d09-96b9-8cfaa2e75901","Type":"ContainerStarted","Data":"f5ab376c07b11f4f9ef6195ff2280391927d61764108b56448800a0924cab5fd"}
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.926562 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" event={"ID":"4ca54095-1004-4c8c-a8bd-89a2c28fae76","Type":"ContainerStarted","Data":"3e85c84141b513daea57de3018311dc971c13dd91433897ed5eb577423fd09a7"}
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.926618 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" event={"ID":"4ca54095-1004-4c8c-a8bd-89a2c28fae76","Type":"ContainerStarted","Data":"718ee57effe01a045acc4948d3c0f87753de219eecdd4a9e9aefae5a9031e3e6"}
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.935758 4911 generic.go:334] "Generic (PLEG): container finished" podID="a409f0e6-8131-4025-b258-842a71a125b6" containerID="fed6a6979c2146e61fa9da0567605cb4a4182bdf2741195fedd25469d12e74db" exitCode=0
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.935871 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" event={"ID":"a409f0e6-8131-4025-b258-842a71a125b6","Type":"ContainerDied","Data":"fed6a6979c2146e61fa9da0567605cb4a4182bdf2741195fedd25469d12e74db"}
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.966186 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn" event={"ID":"a7ecb36c-d108-4771-bdc6-0d3489756695","Type":"ContainerStarted","Data":"bee2698c7ae71cfdc902a4e38521d8bb5eb7e209f219d58fa34c229e0da1340a"}
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.974545 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" event={"ID":"0fff7433-4b4b-457c-a985-3cd636960250","Type":"ContainerStarted","Data":"923c212e47834552fe79bce4687efb178e1b41f7c3902e648aa92e630ab77cef"}
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.984259 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x" event={"ID":"de4371fc-62a4-4ef6-ab10-4d742f56b8de","Type":"ContainerStarted","Data":"3d2a91e67a99d24beab35396cb7a1a5b7618c74c988c4721e537ab67abb6fc25"}
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.987062 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" event={"ID":"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7","Type":"ContainerStarted","Data":"6391621fd12638a6bb655b8775db7ba4e6e7b76136265d7b9953f8897a8ed5b4"}
Mar 10 14:05:15 crc kubenswrapper[4911]: I0310 14:05:15.992205 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nfhwf" event={"ID":"1ad00bf3-d146-4d5d-806e-fb340e3762bf","Type":"ContainerStarted","Data":"fc47842316b0c457cd8250f222ffc450768a14190e1f22262567be4ef3287661"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.006048 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.007531 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:16.507226934 +0000 UTC m=+221.070746851 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.008699 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" event={"ID":"ff12c614-1348-42b6-ab73-75e3f89c20aa","Type":"ContainerStarted","Data":"1c4cdf9e0542affbe8aa5d13da2f3fa7c93ff7adfc110d7b2e10cea4946c7b9d"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.015152 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52210: no serving certificate available for the kubelet"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.019954 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk" event={"ID":"96814f30-58d3-4501-bb95-185465b9df21","Type":"ContainerStarted","Data":"ad9b6c50c4b840392de763a83eb3b18ce0f272f278ff8cc8613b3685b918ad71"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.027538 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-zrr58" podStartSLOduration=185.027514213 podStartE2EDuration="3m5.027514213s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:16.026004476 +0000 UTC m=+220.589524393" watchObservedRunningTime="2026-03-10 14:05:16.027514213 +0000 UTC m=+220.591034130"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.030438 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4" event={"ID":"c671ff01-6a3e-4626-8e8a-c78feb4b3491","Type":"ContainerStarted","Data":"8cdcc7bae993b3f68a55e3207f8d5621cebea889441e1d5be8231945c7f8e448"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.033848 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" event={"ID":"6e3c1bcf-b096-4526-8b7c-d304a5afa191","Type":"ContainerStarted","Data":"53da0232ea43af6b780be28c2a641408d2234ba85b2f9143f969de55933c408c"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.036289 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj" event={"ID":"61dbbc3f-94f4-4c65-8c0b-7181159fcae3","Type":"ContainerStarted","Data":"1329f2429221c74b366c62c424863fdb51661664be121d48dca09fac3abfe0cd"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.043935 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8z6vx" event={"ID":"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7","Type":"ContainerStarted","Data":"cf8d72ce2295b83418a17221097a40b57609ded493144095ced737bdd94f9606"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.046233 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" event={"ID":"0d20f65c-3043-4310-83f1-300d0283f9b4","Type":"ContainerStarted","Data":"06fde4e3bfd3899571ef27f1d700ec6e54d75a1071ddfb599bdb42f85933d32a"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.046287 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" event={"ID":"0d20f65c-3043-4310-83f1-300d0283f9b4","Type":"ContainerStarted","Data":"736e5f7f64cbefe8f9391a8d0fa29f06494536b0b6af941be546ef62301d9d9c"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.049566 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" event={"ID":"2160b26f-7876-42d1-8d78-22f6f57cb08e","Type":"ContainerStarted","Data":"c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.050059 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.053205 4911 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7tfvh container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body=
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.053249 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" podUID="2160b26f-7876-42d1-8d78-22f6f57cb08e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.056637 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" event={"ID":"aa0f80b6-3d76-4b00-a67c-e5ace1795e58","Type":"ContainerStarted","Data":"6a4c3e281e1e0f6dae41037c2664daf71afc44a48b0138b0ce0cd75a7f6b1c0b"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.065410 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5hb5z" event={"ID":"a081d982-3537-4193-9dcc-fa423bdd0fbc","Type":"ContainerStarted","Data":"d01eb53902e0785203663c98d678efe61bf5bf8dce264bc6ff964c2f8efdbe7c"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.070031 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn" event={"ID":"c2c2a7e9-e34c-4dbf-81fa-945a91578cd7","Type":"ContainerStarted","Data":"a4713ac8816352e8436c0052367e16fbe899ca284ef657194ebcacaff946a4e8"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.071328 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552524-98vcv" event={"ID":"30365124-15de-458c-b8f8-97b0fab41da4","Type":"ContainerStarted","Data":"fc9e42fd5098d6176d2287923450c19fb71f879319c73b51ec3a78e6e695c00d"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.072142 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" event={"ID":"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02","Type":"ContainerStarted","Data":"cec8345f9f997a7569abdb07e834fd72d25d07fdc4f23fdb5ee7c20a1ed3f45a"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.074700 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s5jkn" event={"ID":"39208142-b788-4b42-a0f2-421544f8833f","Type":"ContainerStarted","Data":"4950ec09b760cf36874df3c89d7e7a53359baf6f0642afa6335b965bbccfa60a"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.079153 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" podStartSLOduration=185.079105465 podStartE2EDuration="3m5.079105465s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:16.078352777 +0000 UTC m=+220.641872724" watchObservedRunningTime="2026-03-10 14:05:16.079105465 +0000 UTC m=+220.642625382"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.082675 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-f4tz6" podStartSLOduration=185.08263846 podStartE2EDuration="3m5.08263846s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:16.052933825 +0000 UTC m=+220.616453742" watchObservedRunningTime="2026-03-10 14:05:16.08263846 +0000 UTC m=+220.646158377"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.083721 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" event={"ID":"c33024d8-cc44-404d-91c1-f10018a0ffe4","Type":"ContainerStarted","Data":"52d3376a795d468ab07a92ba9d7e9489510516f242c41eca51096bf12f5ca422"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.087454 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b" event={"ID":"596d1d14-4266-4409-87fb-0a155d8e69a4","Type":"ContainerStarted","Data":"be3f65a2e0c4e6338fc2b3ec7396796a6c49494d815cd6b0fcc0d33e20e25d9a"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.090802 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" event={"ID":"44e30f3b-b91b-42c7-9a50-35b8b08f2e25","Type":"ContainerStarted","Data":"736402875f660a1f078dfdf90eaedb64ee9a88c65414c22511093e111a1948f4"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.092426 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.094676 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" event={"ID":"826d5f3c-f7f7-404e-b5c5-6a33cde74892","Type":"ContainerStarted","Data":"3ae25f54928a5311f48f5d1e33b01d1fc820bb9aa708a14d2e98b1f59dbf49dc"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.099815 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" event={"ID":"2c79f6f9-05f1-44eb-8565-f0d888ee5163","Type":"ContainerStarted","Data":"bef5fe8c8069b3ca937f88cd5b28440272634a30a1794ce63f92eacccd825cf5"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.099879 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" event={"ID":"2c79f6f9-05f1-44eb-8565-f0d888ee5163","Type":"ContainerStarted","Data":"f723499b4a6541dad84cab6eada43935d61c658290a231858dd8c6cc4b9ac739"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.110082 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" podStartSLOduration=185.110062891 podStartE2EDuration="3m5.110062891s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:16.108900243 +0000 UTC m=+220.672420160" watchObservedRunningTime="2026-03-10 14:05:16.110062891 +0000 UTC m=+220.673582798"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.112126 4911 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-47624 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body=
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.112162 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" podUID="44e30f3b-b91b-42c7-9a50-35b8b08f2e25" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.116105 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.120540 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52214: no serving certificate available for the kubelet"
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.122978 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:16.622958741 +0000 UTC m=+221.186478658 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.128174 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" event={"ID":"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37","Type":"ContainerStarted","Data":"d08deb3cb534ad7bb8b5e34c7f553f0965ca4675c712abe206e393d051e7b526"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.145119 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd" event={"ID":"91baa02e-ad6d-425e-9be1-03972ce6d4ec","Type":"ContainerStarted","Data":"429d423e723728feee485f5b5cbd26e77d20b450a14ed8655a7d32245debf732"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.147243 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4s5dj" podStartSLOduration=185.147215005 podStartE2EDuration="3m5.147215005s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:16.146253612 +0000 UTC m=+220.709773529" watchObservedRunningTime="2026-03-10 14:05:16.147215005 +0000 UTC m=+220.710734912"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.157609 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd" event={"ID":"f9d28727-b2f1-4e6a-83c8-c5e6f489e5a0","Type":"ContainerStarted","Data":"3cfc72e0b754d9acf68405790c86afc7fd1692583e882ab9c66f349459bcfbe3"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.179459 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" event={"ID":"2f1294ca-e6aa-4294-826d-3d37b345aea2","Type":"ContainerStarted","Data":"746c992df7b4d80558672dac24689fd61920e38dc462ab84d614d0a16d0d1b99"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.181429 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6j5j7" podStartSLOduration=185.181411449 podStartE2EDuration="3m5.181411449s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:16.180754253 +0000 UTC m=+220.744274170" watchObservedRunningTime="2026-03-10 14:05:16.181411449 +0000 UTC m=+220.744931366"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.214743 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v6zbd" podStartSLOduration=185.214704281 podStartE2EDuration="3m5.214704281s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:16.213616724 +0000 UTC m=+220.777136662" watchObservedRunningTime="2026-03-10 14:05:16.214704281 +0000 UTC m=+220.778224198"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.217379 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.218260 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:16.718242066 +0000 UTC m=+221.281761983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.238326 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lpjl7" event={"ID":"28f8f1c7-122d-47a4-8de8-90db75c3365b","Type":"ContainerStarted","Data":"bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.238380 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lpjl7" event={"ID":"28f8f1c7-122d-47a4-8de8-90db75c3365b","Type":"ContainerStarted","Data":"6b5a6cc00b38085d5ef888ddedb3a6abb462c75175925d945c102fb5bfce577b"}
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.239933 4911 patch_prober.go:28] interesting pod/console-operator-58897d9998-4plxk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.239976 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4plxk" podUID="4dd0502d-ca6f-45ff-8a4c-bfff96edaf04" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.248463 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" podStartSLOduration=185.248436473 podStartE2EDuration="3m5.248436473s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:16.247161222 +0000 UTC m=+220.810681139" watchObservedRunningTime="2026-03-10 14:05:16.248436473 +0000 UTC m=+220.811956390"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.283191 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52228: no serving certificate available for the kubelet"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.279895 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.314190 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzzd" podStartSLOduration=185.314170406 podStartE2EDuration="3m5.314170406s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:16.291055769 +0000 UTC m=+220.854575686" watchObservedRunningTime="2026-03-10 14:05:16.314170406 +0000 UTC m=+220.877690323"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.315384 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lpjl7" podStartSLOduration=185.315379395 podStartE2EDuration="3m5.315379395s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:16.313416848 +0000 UTC m=+220.876936765" watchObservedRunningTime="2026-03-10 14:05:16.315379395 +0000 UTC m=+220.878899312"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.320767 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.323224 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:16.823205804 +0000 UTC m=+221.386725721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.424783 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.425168 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:16.925125848 +0000 UTC m=+221.488645765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.425570 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.425902 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:16.925890166 +0000 UTC m=+221.489410083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.484337 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52238: no serving certificate available for the kubelet"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.544543 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.544905 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.044884592 +0000 UTC m=+221.608404509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.646143 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.646964 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.14695117 +0000 UTC m=+221.710471087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.749246 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.749500 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.249447498 +0000 UTC m=+221.812967425 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.750009 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.750475 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.250449012 +0000 UTC m=+221.813968929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.801877 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 14:05:16 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld
Mar 10 14:05:16 crc kubenswrapper[4911]: [+]process-running ok
Mar 10 14:05:16 crc kubenswrapper[4911]: healthz check failed
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.801948 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.851130 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.851756 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.351711101 +0000 UTC m=+221.915231018 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.870446 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52250: no serving certificate available for the kubelet"
Mar 10 14:05:16 crc kubenswrapper[4911]: I0310 14:05:16.953034 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:16 crc kubenswrapper[4911]: E0310 14:05:16.953388 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.453370019 +0000 UTC m=+222.016889936 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.054312 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.054549 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.554504615 +0000 UTC m=+222.118024522 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.054613 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.055030 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.555023007 +0000 UTC m=+222.118542924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.155906 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.156178 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.656133402 +0000 UTC m=+222.219653319 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.156445 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.156948 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.656936832 +0000 UTC m=+222.220456929 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.246078 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn" event={"ID":"c2c2a7e9-e34c-4dbf-81fa-945a91578cd7","Type":"ContainerStarted","Data":"9ef50cf0ae5b1f5afb76855250ec408adebdd16c7aa0659e5995495964b14c0e"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.246136 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn" event={"ID":"c2c2a7e9-e34c-4dbf-81fa-945a91578cd7","Type":"ContainerStarted","Data":"92a39a3077099e30339b9ab8682ca53ed47e0dccdc2df4bdedc9841f225ead61"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.246224 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.254032 4911 generic.go:334] "Generic (PLEG): container finished" podID="3ed3ce1b-b217-44d0-8f24-57c9eb678ea7" containerID="95009321bb97cb9bb7eaec2abf82f9fff3e5b85c7c3282265ff5c0aae790957b" exitCode=0 Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.254150 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" event={"ID":"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7","Type":"ContainerDied","Data":"95009321bb97cb9bb7eaec2abf82f9fff3e5b85c7c3282265ff5c0aae790957b"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 
14:05:17.257127 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.257319 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.757285078 +0000 UTC m=+222.320804995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.257551 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.257915 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 14:05:17.757899353 +0000 UTC m=+222.321419270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.258977 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn" event={"ID":"a7ecb36c-d108-4771-bdc6-0d3489756695","Type":"ContainerStarted","Data":"9218e3784d7a5d41f36c6262030382560141cfeff81d82bd4f393efdfb3dbfff"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.261424 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" event={"ID":"0fff7433-4b4b-457c-a985-3cd636960250","Type":"ContainerStarted","Data":"c2b88c20d7e97c973e0306b3a666f4754b8f2574bf2724e086b509e4225dc775"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.265059 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4" event={"ID":"c671ff01-6a3e-4626-8e8a-c78feb4b3491","Type":"ContainerStarted","Data":"612a2c6a9f76b871f96c7693600b88c3ea94506f4213cf0b5508c7263a155c63"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.275531 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5hb5z" event={"ID":"a081d982-3537-4193-9dcc-fa423bdd0fbc","Type":"ContainerStarted","Data":"0525a08fb0b79bfd6a4bcf3a1b5fb2962e5a8f15ba2647bbcd1714fa0afee240"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.280317 
4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b" event={"ID":"596d1d14-4266-4409-87fb-0a155d8e69a4","Type":"ContainerStarted","Data":"961e387576ef510451444b5156ac07c52522298ddd1f383a2c709f5fc7f847b0"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.280394 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b" event={"ID":"596d1d14-4266-4409-87fb-0a155d8e69a4","Type":"ContainerStarted","Data":"daff007130dbca4528de6b0314cec3205bf094c10f8bf3194dba030e2f7c72cf"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.298437 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" event={"ID":"a409f0e6-8131-4025-b258-842a71a125b6","Type":"ContainerStarted","Data":"2c6a26886545c2064191070e29acf46e4d8675a04c2d436015216d87b30094d5"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.301586 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x" event={"ID":"de4371fc-62a4-4ef6-ab10-4d742f56b8de","Type":"ContainerStarted","Data":"f2a0436d888acde356be913538c8b5f793178167ea175bcd7f07406fafda7659"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.312074 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8z6vx" event={"ID":"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7","Type":"ContainerStarted","Data":"25d8129ec687b12a7090a812c97236b07764f96ac84adecdb91684c2c5f73514"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.312146 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8z6vx" event={"ID":"f4f3cef3-8cc4-42a5-84b6-f2c86f4535c7","Type":"ContainerStarted","Data":"03f7966483f001ab29856e94d786f03360456f3af5840003824be5e5c4c0083a"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.313143 4911 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.313968 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-297k4" podStartSLOduration=186.313954422 podStartE2EDuration="3m6.313954422s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.313606123 +0000 UTC m=+221.877126050" watchObservedRunningTime="2026-03-10 14:05:17.313954422 +0000 UTC m=+221.877474339" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.316298 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn" podStartSLOduration=186.316282058 podStartE2EDuration="3m6.316282058s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.266007487 +0000 UTC m=+221.829527404" watchObservedRunningTime="2026-03-10 14:05:17.316282058 +0000 UTC m=+221.879801975" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.340279 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk" event={"ID":"96814f30-58d3-4501-bb95-185465b9df21","Type":"ContainerStarted","Data":"ec017f0d2df20a0bc0a0c4bd0ab3b9549a20172c0378410bd75f45385327d609"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.350518 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jfn5v" podStartSLOduration=186.350486902 podStartE2EDuration="3m6.350486902s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.347449358 +0000 UTC m=+221.910969285" watchObservedRunningTime="2026-03-10 14:05:17.350486902 +0000 UTC m=+221.914006819" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.359412 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.362848 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.86245519 +0000 UTC m=+222.425975117 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.415402 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh5gn" podStartSLOduration=186.415377454 podStartE2EDuration="3m6.415377454s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.413894449 +0000 UTC m=+221.977414366" watchObservedRunningTime="2026-03-10 14:05:17.415377454 +0000 UTC m=+221.978897371" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.418237 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" event={"ID":"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02","Type":"ContainerStarted","Data":"cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.418849 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.420020 4911 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rbltz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 
14:05:17.420075 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" podUID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.437449 4911 generic.go:334] "Generic (PLEG): container finished" podID="4ca54095-1004-4c8c-a8bd-89a2c28fae76" containerID="3e85c84141b513daea57de3018311dc971c13dd91433897ed5eb577423fd09a7" exitCode=0 Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.437558 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" event={"ID":"4ca54095-1004-4c8c-a8bd-89a2c28fae76","Type":"ContainerDied","Data":"3e85c84141b513daea57de3018311dc971c13dd91433897ed5eb577423fd09a7"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.462103 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.462785 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:17.962703814 +0000 UTC m=+222.526223731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.469048 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" event={"ID":"203f9b7d-fef8-4d09-96b9-8cfaa2e75901","Type":"ContainerStarted","Data":"faec8da5e13620e0a56d0dbbd1264560eb5e62b21184ea9e498c286e1df737e3"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.475745 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" event={"ID":"2f1294ca-e6aa-4294-826d-3d37b345aea2","Type":"ContainerStarted","Data":"603a33ddb480f1157b4eb6c8b5c689a30c4abf92f97e0f4a63b01f1603e3aeaa"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.476899 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.499200 4911 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-kmlz4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.499266 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" podUID="2f1294ca-e6aa-4294-826d-3d37b345aea2" 
containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.503035 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nfhwf" event={"ID":"1ad00bf3-d146-4d5d-806e-fb340e3762bf","Type":"ContainerStarted","Data":"ff97ffb11a3306456555b324514234449f0eec82509900e46784181055caa665"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.504069 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nfhwf" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.516688 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" event={"ID":"826d5f3c-f7f7-404e-b5c5-6a33cde74892","Type":"ContainerStarted","Data":"aa85f70efc055d7cec51dbbc7f548ae6d6bc8dca724adc3325dc17e9f5fcd46f"} Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.518469 4911 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-47624 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.518535 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" podUID="44e30f3b-b91b-42c7-9a50-35b8b08f2e25" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.519617 4911 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7tfvh container/oauth-openshift namespace/openshift-authentication: Readiness probe 
status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.519677 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" podUID="2160b26f-7876-42d1-8d78-22f6f57cb08e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.525234 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.525330 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.546864 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8z6vx" podStartSLOduration=7.54683341 podStartE2EDuration="7.54683341s" podCreationTimestamp="2026-03-10 14:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.492598354 +0000 UTC m=+222.056118271" watchObservedRunningTime="2026-03-10 14:05:17.54683341 +0000 UTC m=+222.110353327" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.549248 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-74p5x" 
podStartSLOduration=186.549234288 podStartE2EDuration="3m6.549234288s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.537473815 +0000 UTC m=+222.100993732" watchObservedRunningTime="2026-03-10 14:05:17.549234288 +0000 UTC m=+222.112754205" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.568823 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.569809 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.069786103 +0000 UTC m=+222.633306020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.575841 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52266: no serving certificate available for the kubelet" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.636262 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bsj4b" podStartSLOduration=186.636240393 podStartE2EDuration="3m6.636240393s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.589887277 +0000 UTC m=+222.153407224" watchObservedRunningTime="2026-03-10 14:05:17.636240393 +0000 UTC m=+222.199760310" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.672503 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.684279 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 14:05:18.18425885 +0000 UTC m=+222.747778767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.686663 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" podStartSLOduration=186.686640617 podStartE2EDuration="3m6.686640617s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.684595408 +0000 UTC m=+222.248115325" watchObservedRunningTime="2026-03-10 14:05:17.686640617 +0000 UTC m=+222.250160524" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.716063 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gwspk" podStartSLOduration=186.716037705 podStartE2EDuration="3m6.716037705s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.707886559 +0000 UTC m=+222.271406476" watchObservedRunningTime="2026-03-10 14:05:17.716037705 +0000 UTC m=+222.279557622" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.774258 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.775849 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.275827095 +0000 UTC m=+222.839347012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.812186 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:17 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:17 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:17 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.812273 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.822897 4911 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-canary/ingress-canary-5hb5z" podStartSLOduration=7.822876838 podStartE2EDuration="7.822876838s" podCreationTimestamp="2026-03-10 14:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.758644731 +0000 UTC m=+222.322164648" watchObservedRunningTime="2026-03-10 14:05:17.822876838 +0000 UTC m=+222.386396755" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.823643 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.823814 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" podStartSLOduration=186.82380919 podStartE2EDuration="3m6.82380919s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.819140728 +0000 UTC m=+222.382660645" watchObservedRunningTime="2026-03-10 14:05:17.82380919 +0000 UTC m=+222.387329097" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.823864 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.827453 4911 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-cqz22 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.827541 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22" podUID="a409f0e6-8131-4025-b258-842a71a125b6" 
containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.876273 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.877034 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.377019532 +0000 UTC m=+222.940539449 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.891771 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wlwqw" podStartSLOduration=186.891739496 podStartE2EDuration="3m6.891739496s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.888891817 +0000 UTC m=+222.452411734" watchObservedRunningTime="2026-03-10 14:05:17.891739496 +0000 
UTC m=+222.455259423" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.892528 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nfhwf" podStartSLOduration=186.892521015 podStartE2EDuration="3m6.892521015s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.862132373 +0000 UTC m=+222.425652280" watchObservedRunningTime="2026-03-10 14:05:17.892521015 +0000 UTC m=+222.456040932" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.919162 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" podStartSLOduration=186.919133976 podStartE2EDuration="3m6.919133976s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.919066914 +0000 UTC m=+222.482586831" watchObservedRunningTime="2026-03-10 14:05:17.919133976 +0000 UTC m=+222.482653893" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.988318 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8vtxn" podStartSLOduration=186.988277611 podStartE2EDuration="3m6.988277611s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:17.951928126 +0000 UTC m=+222.515448043" watchObservedRunningTime="2026-03-10 14:05:17.988277611 +0000 UTC m=+222.551797528" Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.989448 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.989593 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.489574012 +0000 UTC m=+223.053093929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:17 crc kubenswrapper[4911]: I0310 14:05:17.989781 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:17 crc kubenswrapper[4911]: E0310 14:05:17.990230 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.490214417 +0000 UTC m=+223.053734334 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.053010 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" podStartSLOduration=187.052986419 podStartE2EDuration="3m7.052986419s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:18.047770904 +0000 UTC m=+222.611290821" watchObservedRunningTime="2026-03-10 14:05:18.052986419 +0000 UTC m=+222.616506336" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.053499 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nl7jj" podStartSLOduration=187.053494281 podStartE2EDuration="3m7.053494281s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:18.020505437 +0000 UTC m=+222.584025354" watchObservedRunningTime="2026-03-10 14:05:18.053494281 +0000 UTC m=+222.617014198" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.091665 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.092054 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.59203812 +0000 UTC m=+223.155558037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.194590 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.195339 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.695323817 +0000 UTC m=+223.258843724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.298596 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.298859 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.798791419 +0000 UTC m=+223.362311336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.299268 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.299653 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.799637359 +0000 UTC m=+223.363157276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.402213 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.402441 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.902403434 +0000 UTC m=+223.465923351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.402664 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.403010 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:18.902995858 +0000 UTC m=+223.466515775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.504203 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.504688 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.004636986 +0000 UTC m=+223.568156903 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.504960 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.505415 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.005407514 +0000 UTC m=+223.568927431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.520772 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.520847 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.542037 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" event={"ID":"4ca54095-1004-4c8c-a8bd-89a2c28fae76","Type":"ContainerStarted","Data":"e17fb5e87501a49326c5c77a6733f9c1c832ca6cd49d8ffab3f4760ce6995111"} Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.542899 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.555013 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" 
event={"ID":"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7","Type":"ContainerStarted","Data":"6a2b3c44018af5bde888ed230c56c96602ab719aec16d505d8a87ed2fe072ad5"} Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.555064 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" event={"ID":"3ed3ce1b-b217-44d0-8f24-57c9eb678ea7","Type":"ContainerStarted","Data":"46bcd47f3fc3d124f6ba79446426d68deaab5c438dccf2e62130c54165d992c4"} Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.570018 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" event={"ID":"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37","Type":"ContainerStarted","Data":"7018173447b15dcf2ad8b1b9ac4516a8097ec12edd80b0fc9fa1554ee87f55a8"} Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.571120 4911 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rbltz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.571184 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" podUID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.571252 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.571332 4911 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.574421 4911 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-kmlz4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.574478 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" podUID="2f1294ca-e6aa-4294-826d-3d37b345aea2" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.619402 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.620814 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.120789123 +0000 UTC m=+223.684309050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.621819 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.647212 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47624" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.722466 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.723001 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.222974594 +0000 UTC m=+223.786494511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.738481 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.738924 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.741369 4911 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lcvpv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.43:8443/livez\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.741437 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" podUID="3ed3ce1b-b217-44d0-8f24-57c9eb678ea7" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.43:8443/livez\": dial tcp 10.217.0.43:8443: connect: connection refused" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.805445 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:18 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:18 crc kubenswrapper[4911]: 
[+]process-running ok Mar 10 14:05:18 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.805523 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.830218 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.830767 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.330743879 +0000 UTC m=+223.894263796 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.911191 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" podStartSLOduration=187.911173066 podStartE2EDuration="3m7.911173066s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:18.673609495 +0000 UTC m=+223.237129412" watchObservedRunningTime="2026-03-10 14:05:18.911173066 +0000 UTC m=+223.474692983" Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.932187 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:18 crc kubenswrapper[4911]: E0310 14:05:18.932588 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.432574892 +0000 UTC m=+223.996094809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:18 crc kubenswrapper[4911]: I0310 14:05:18.956822 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52276: no serving certificate available for the kubelet" Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.033409 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.033911 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.533895792 +0000 UTC m=+224.097415709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.135714 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.136136 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.636123993 +0000 UTC m=+224.199643910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.225042 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.236908 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.237126 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.737081815 +0000 UTC m=+224.300601732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.237284 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.237782 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.737767871 +0000 UTC m=+224.301287968 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.270382 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" podStartSLOduration=188.270357976 podStartE2EDuration="3m8.270357976s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:19.078963667 +0000 UTC m=+223.642483584" watchObservedRunningTime="2026-03-10 14:05:19.270357976 +0000 UTC m=+223.833877893" Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.338795 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.339260 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.839243025 +0000 UTC m=+224.402762942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.441176 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.441671 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:19.941655021 +0000 UTC m=+224.505174928 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.542501 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.542963 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.04294026 +0000 UTC m=+224.606460177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.593742 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.593807 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.605861 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kmlz4" Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.644841 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.645258 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.145236794 +0000 UTC m=+224.708756711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.745616 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.745913 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.245869447 +0000 UTC m=+224.809389364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.746317 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.748090 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.24806343 +0000 UTC m=+224.811583547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.799776 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:19 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:19 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:19 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.799848 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.848090 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.848208 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 14:05:20.348185741 +0000 UTC m=+224.911705658 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.848600 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.849061 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.349039102 +0000 UTC m=+224.912559019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.950120 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.950348 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.450311741 +0000 UTC m=+225.013831658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:19 crc kubenswrapper[4911]: I0310 14:05:19.950454 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:19 crc kubenswrapper[4911]: E0310 14:05:19.950845 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.450837323 +0000 UTC m=+225.014357230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.051581 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.051772 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.551735493 +0000 UTC m=+225.115255410 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.051967 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.052371 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.552361408 +0000 UTC m=+225.115881515 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.153351 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.153504 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.653479463 +0000 UTC m=+225.216999390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.153663 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.154097 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.654086018 +0000 UTC m=+225.217605935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.206818 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.207498 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.210548 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.211322 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.234203 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.255194 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.255387 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.755342426 +0000 UTC m=+225.318862353 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.255573 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.255954 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.755944751 +0000 UTC m=+225.319464848 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.356804 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.357126 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a076468-8db4-42a1-9780-77d667f8375a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3a076468-8db4-42a1-9780-77d667f8375a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.357223 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a076468-8db4-42a1-9780-77d667f8375a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3a076468-8db4-42a1-9780-77d667f8375a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.357358 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.857337183 +0000 UTC m=+225.420857110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.458901 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.458962 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a076468-8db4-42a1-9780-77d667f8375a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3a076468-8db4-42a1-9780-77d667f8375a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.459051 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a076468-8db4-42a1-9780-77d667f8375a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3a076468-8db4-42a1-9780-77d667f8375a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.459748 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:20.959710938 +0000 UTC m=+225.523230855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.459953 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a076468-8db4-42a1-9780-77d667f8375a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3a076468-8db4-42a1-9780-77d667f8375a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.502496 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a076468-8db4-42a1-9780-77d667f8375a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3a076468-8db4-42a1-9780-77d667f8375a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.531185 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.560353 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.560569 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.060523676 +0000 UTC m=+225.624043603 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.560685 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.561066 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.061052158 +0000 UTC m=+225.624572075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.610477 4911 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-n2q7w container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body=
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.610545 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w" podUID="4ca54095-1004-4c8c-a8bd-89a2c28fae76" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused"
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.661924 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.662411 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.162370048 +0000 UTC m=+225.725889965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.750246 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqkdl"]
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.751205 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" podUID="d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" containerName="controller-manager" containerID="cri-o://958f6b47ebb0ea654802512722c2af73cfbcfabdcf539e93b1f46d16ca3febc9" gracePeriod=30
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.764433 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.766068 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.266045035 +0000 UTC m=+225.829565122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.799797 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q"]
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.800191 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" podUID="331e0c1d-7674-44da-bc5e-39358bbe9d07" containerName="route-controller-manager" containerID="cri-o://34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d" gracePeriod=30
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.808825 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 14:05:20 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld
Mar 10 14:05:20 crc kubenswrapper[4911]: [+]process-running ok
Mar 10 14:05:20 crc kubenswrapper[4911]: healthz check failed
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.808884 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.866395 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.866647 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.366602856 +0000 UTC m=+225.930122793 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.866811 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.867173 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.367159079 +0000 UTC m=+225.930678996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.924389 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.968597 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:20 crc kubenswrapper[4911]: E0310 14:05:20.969554 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.469509204 +0000 UTC m=+226.033029141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:20 crc kubenswrapper[4911]: W0310 14:05:20.977264 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3a076468_8db4_42a1_9780_77d667f8375a.slice/crio-44a17d515d2c32bfa3a1d781b9c2f4669c3ba085d6d0f9ee42e45fcd3ef32a9c WatchSource:0}: Error finding container 44a17d515d2c32bfa3a1d781b9c2f4669c3ba085d6d0f9ee42e45fcd3ef32a9c: Status 404 returned error can't find the container with id 44a17d515d2c32bfa3a1d781b9c2f4669c3ba085d6d0f9ee42e45fcd3ef32a9c
Mar 10 14:05:20 crc kubenswrapper[4911]: I0310 14:05:20.999153 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ttx9c"]
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.000226 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.002759 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.011119 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ttx9c"]
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.070132 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.070532 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.570512206 +0000 UTC m=+226.134032123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.107948 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w2clr"]
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.109005 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.112335 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.129367 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w2clr"]
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.171038 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.171256 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.671218311 +0000 UTC m=+226.234738228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.171384 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-catalog-content\") pod \"certified-operators-ttx9c\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.171532 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpcqd\" (UniqueName: \"kubernetes.io/projected/9549430d-b06c-4c28-87bc-6320e73c31e5-kube-api-access-hpcqd\") pod \"certified-operators-ttx9c\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.171574 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-utilities\") pod \"certified-operators-ttx9c\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.171637 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.172015 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.67200638 +0000 UTC m=+226.235526297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.272550 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.272947 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-utilities\") pod \"community-operators-w2clr\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.273028 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-catalog-content\") pod \"certified-operators-ttx9c\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.273075 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wxf2\" (UniqueName: \"kubernetes.io/projected/e1796b5b-f2e1-4a7a-9463-039bb296626a-kube-api-access-7wxf2\") pod \"community-operators-w2clr\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.273129 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpcqd\" (UniqueName: \"kubernetes.io/projected/9549430d-b06c-4c28-87bc-6320e73c31e5-kube-api-access-hpcqd\") pod \"certified-operators-ttx9c\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.273164 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-utilities\") pod \"certified-operators-ttx9c\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.273215 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-catalog-content\") pod \"community-operators-w2clr\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.273365 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.773341651 +0000 UTC m=+226.336861568 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.274295 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-catalog-content\") pod \"certified-operators-ttx9c\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.274637 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-utilities\") pod \"certified-operators-ttx9c\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.305178 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpcqd\" (UniqueName: \"kubernetes.io/projected/9549430d-b06c-4c28-87bc-6320e73c31e5-kube-api-access-hpcqd\") pod \"certified-operators-ttx9c\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.318327 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cvtlp"]
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.320866 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cvtlp"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.332884 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cvtlp"]
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.337703 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttx9c"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.354197 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.389845 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-utilities\") pod \"community-operators-w2clr\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.389951 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wxf2\" (UniqueName: \"kubernetes.io/projected/e1796b5b-f2e1-4a7a-9463-039bb296626a-kube-api-access-7wxf2\") pod \"community-operators-w2clr\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.390040 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.390065 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-catalog-content\") pod \"community-operators-w2clr\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.390415 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-utilities\") pod \"community-operators-w2clr\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.390552 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-catalog-content\") pod \"community-operators-w2clr\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.390877 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.890858911 +0000 UTC m=+226.454378828 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.411277 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wxf2\" (UniqueName: \"kubernetes.io/projected/e1796b5b-f2e1-4a7a-9463-039bb296626a-kube-api-access-7wxf2\") pod \"community-operators-w2clr\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.429546 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w2clr"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.495934 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.496011 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4xsk\" (UniqueName: \"kubernetes.io/projected/331e0c1d-7674-44da-bc5e-39358bbe9d07-kube-api-access-l4xsk\") pod \"331e0c1d-7674-44da-bc5e-39358bbe9d07\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") "
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.496050 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/331e0c1d-7674-44da-bc5e-39358bbe9d07-serving-cert\") pod \"331e0c1d-7674-44da-bc5e-39358bbe9d07\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") "
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.496117 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-client-ca\") pod \"331e0c1d-7674-44da-bc5e-39358bbe9d07\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") "
Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.496176 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.996137776 +0000 UTC m=+226.559657693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.496227 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-config\") pod \"331e0c1d-7674-44da-bc5e-39358bbe9d07\" (UID: \"331e0c1d-7674-44da-bc5e-39358bbe9d07\") "
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.496565 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-catalog-content\") pod \"certified-operators-cvtlp\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") " pod="openshift-marketplace/certified-operators-cvtlp"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.496700 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-utilities\") pod \"certified-operators-cvtlp\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") " pod="openshift-marketplace/certified-operators-cvtlp"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.496928 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.496997 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8jfr\" (UniqueName: \"kubernetes.io/projected/b986951d-80c8-4f06-a12b-9dd8047a7bf5-kube-api-access-c8jfr\") pod \"certified-operators-cvtlp\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") " pod="openshift-marketplace/certified-operators-cvtlp"
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.497471 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-client-ca" (OuterVolumeSpecName: "client-ca") pod "331e0c1d-7674-44da-bc5e-39358bbe9d07" (UID: "331e0c1d-7674-44da-bc5e-39358bbe9d07"). InnerVolumeSpecName "client-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.499855 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:21.999839195 +0000 UTC m=+226.563359312 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.509805 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-config" (OuterVolumeSpecName: "config") pod "331e0c1d-7674-44da-bc5e-39358bbe9d07" (UID: "331e0c1d-7674-44da-bc5e-39358bbe9d07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.519558 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331e0c1d-7674-44da-bc5e-39358bbe9d07-kube-api-access-l4xsk" (OuterVolumeSpecName: "kube-api-access-l4xsk") pod "331e0c1d-7674-44da-bc5e-39358bbe9d07" (UID: "331e0c1d-7674-44da-bc5e-39358bbe9d07"). InnerVolumeSpecName "kube-api-access-l4xsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.524469 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331e0c1d-7674-44da-bc5e-39358bbe9d07-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "331e0c1d-7674-44da-bc5e-39358bbe9d07" (UID: "331e0c1d-7674-44da-bc5e-39358bbe9d07"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.530939 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9hlsw"] Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.531242 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331e0c1d-7674-44da-bc5e-39358bbe9d07" containerName="route-controller-manager" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.531255 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="331e0c1d-7674-44da-bc5e-39358bbe9d07" containerName="route-controller-manager" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.531368 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="331e0c1d-7674-44da-bc5e-39358bbe9d07" containerName="route-controller-manager" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.535001 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.546513 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hlsw"] Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.581124 4911 ???:1] "http: TLS handshake error from 192.168.126.11:52290: no serving certificate available for the kubelet" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.606420 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.606563 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-884ch\" (UniqueName: \"kubernetes.io/projected/713a2021-c450-41c7-b93d-ccee816a9820-kube-api-access-884ch\") pod \"community-operators-9hlsw\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.606600 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8jfr\" (UniqueName: \"kubernetes.io/projected/b986951d-80c8-4f06-a12b-9dd8047a7bf5-kube-api-access-c8jfr\") pod \"certified-operators-cvtlp\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") " pod="openshift-marketplace/certified-operators-cvtlp" Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.606667 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 14:05:22.106641477 +0000 UTC m=+226.670161394 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.607067 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-catalog-content\") pod \"community-operators-9hlsw\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.607096 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-catalog-content\") pod \"certified-operators-cvtlp\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") " pod="openshift-marketplace/certified-operators-cvtlp" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.607324 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-utilities\") pod \"certified-operators-cvtlp\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") " pod="openshift-marketplace/certified-operators-cvtlp" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.607342 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-utilities\") pod \"community-operators-9hlsw\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.607377 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4xsk\" (UniqueName: \"kubernetes.io/projected/331e0c1d-7674-44da-bc5e-39358bbe9d07-kube-api-access-l4xsk\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.607387 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/331e0c1d-7674-44da-bc5e-39358bbe9d07-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.607398 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.607406 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331e0c1d-7674-44da-bc5e-39358bbe9d07-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.607943 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-catalog-content\") pod \"certified-operators-cvtlp\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") " pod="openshift-marketplace/certified-operators-cvtlp" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.608382 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-utilities\") pod \"certified-operators-cvtlp\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") " 
pod="openshift-marketplace/certified-operators-cvtlp" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.630362 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ttx9c"] Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.657214 4911 generic.go:334] "Generic (PLEG): container finished" podID="d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" containerID="958f6b47ebb0ea654802512722c2af73cfbcfabdcf539e93b1f46d16ca3febc9" exitCode=0 Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.657290 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" event={"ID":"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d","Type":"ContainerDied","Data":"958f6b47ebb0ea654802512722c2af73cfbcfabdcf539e93b1f46d16ca3febc9"} Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.660489 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3a076468-8db4-42a1-9780-77d667f8375a","Type":"ContainerStarted","Data":"d5c1fe7c85e4be8a1bbbe42d84ef386b062615a2423f16834b9d7687c7a06899"} Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.660543 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3a076468-8db4-42a1-9780-77d667f8375a","Type":"ContainerStarted","Data":"44a17d515d2c32bfa3a1d781b9c2f4669c3ba085d6d0f9ee42e45fcd3ef32a9c"} Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.661859 4911 generic.go:334] "Generic (PLEG): container finished" podID="0d20f65c-3043-4310-83f1-300d0283f9b4" containerID="06fde4e3bfd3899571ef27f1d700ec6e54d75a1071ddfb599bdb42f85933d32a" exitCode=0 Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.661907 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" 
event={"ID":"0d20f65c-3043-4310-83f1-300d0283f9b4","Type":"ContainerDied","Data":"06fde4e3bfd3899571ef27f1d700ec6e54d75a1071ddfb599bdb42f85933d32a"} Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.664101 4911 generic.go:334] "Generic (PLEG): container finished" podID="331e0c1d-7674-44da-bc5e-39358bbe9d07" containerID="34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d" exitCode=0 Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.664158 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" event={"ID":"331e0c1d-7674-44da-bc5e-39358bbe9d07","Type":"ContainerDied","Data":"34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d"} Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.664191 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" event={"ID":"331e0c1d-7674-44da-bc5e-39358bbe9d07","Type":"ContainerDied","Data":"e6edfd041c49d60c9f106587281aacd3b06962808377846be3d0a00956e78ff6"} Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.664209 4911 scope.go:117] "RemoveContainer" containerID="34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.664377 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" Mar 10 14:05:21 crc kubenswrapper[4911]: W0310 14:05:21.676765 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9549430d_b06c_4c28_87bc_6320e73c31e5.slice/crio-6ed2cf93eeb9238c65b001dc3d85d6036dd2b16d91d95e9ce224d23435058959 WatchSource:0}: Error finding container 6ed2cf93eeb9238c65b001dc3d85d6036dd2b16d91d95e9ce224d23435058959: Status 404 returned error can't find the container with id 6ed2cf93eeb9238c65b001dc3d85d6036dd2b16d91d95e9ce224d23435058959 Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.685048 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8jfr\" (UniqueName: \"kubernetes.io/projected/b986951d-80c8-4f06-a12b-9dd8047a7bf5-kube-api-access-c8jfr\") pod \"certified-operators-cvtlp\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") " pod="openshift-marketplace/certified-operators-cvtlp" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.712699 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-catalog-content\") pod \"community-operators-9hlsw\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.712772 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-utilities\") pod \"community-operators-9hlsw\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.712824 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-884ch\" 
(UniqueName: \"kubernetes.io/projected/713a2021-c450-41c7-b93d-ccee816a9820-kube-api-access-884ch\") pod \"community-operators-9hlsw\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.712845 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.713181 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:22.213167253 +0000 UTC m=+226.776687170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.713661 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-catalog-content\") pod \"community-operators-9hlsw\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.716378 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-utilities\") pod \"community-operators-9hlsw\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.729996 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q"] Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.732424 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q"] Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.735048 4911 scope.go:117] "RemoveContainer" containerID="34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d" Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.736097 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d\": container with ID starting with 34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d not found: ID does not exist" containerID="34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.736150 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d"} err="failed to get container status \"34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d\": rpc error: code = NotFound desc = could not find container \"34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d\": container with ID starting with 34b3835db3d625b4445f1ead0e13f783e63cdab021a738048aaae716871d5b9d not found: ID does not exist" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.750626 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-884ch\" (UniqueName: \"kubernetes.io/projected/713a2021-c450-41c7-b93d-ccee816a9820-kube-api-access-884ch\") pod \"community-operators-9hlsw\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.809936 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.812060 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:21 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:21 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:21 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.812096 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.815362 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.816022 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:22.315986579 +0000 UTC m=+226.879506516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.816197 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.817356 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:22.317343581 +0000 UTC m=+226.880863518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.855149 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w2clr"] Mar 10 14:05:21 crc kubenswrapper[4911]: W0310 14:05:21.863347 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1796b5b_f2e1_4a7a_9463_039bb296626a.slice/crio-ef4db28afd25a9e8cd7c55eabfea72e0aaf4e86a99c96c684826fd1946b5bc00 WatchSource:0}: Error finding container ef4db28afd25a9e8cd7c55eabfea72e0aaf4e86a99c96c684826fd1946b5bc00: Status 404 returned error can't find the container with id ef4db28afd25a9e8cd7c55eabfea72e0aaf4e86a99c96c684826fd1946b5bc00 Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.887110 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.917491 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx5sx\" (UniqueName: \"kubernetes.io/projected/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-kube-api-access-hx5sx\") pod \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.920100 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-config\") pod \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.920275 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.920308 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-proxy-ca-bundles\") pod \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.920363 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-serving-cert\") pod \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") " Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.920450 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-client-ca\") pod \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\" (UID: \"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d\") "
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.921193 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" (UID: "d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.921207 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-client-ca" (OuterVolumeSpecName: "client-ca") pod "d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" (UID: "d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:05:21 crc kubenswrapper[4911]: E0310 14:05:21.921300 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:22.421281514 +0000 UTC m=+226.984801431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.921435 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.921454 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.922003 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-config" (OuterVolumeSpecName: "config") pod "d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" (UID: "d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.925053 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-kube-api-access-hx5sx" (OuterVolumeSpecName: "kube-api-access-hx5sx") pod "d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" (UID: "d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d"). InnerVolumeSpecName "kube-api-access-hx5sx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.925703 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" (UID: "d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:05:21 crc kubenswrapper[4911]: I0310 14:05:21.949592 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cvtlp"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.023572 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.023664 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.023679 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx5sx\" (UniqueName: \"kubernetes.io/projected/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-kube-api-access-hx5sx\") on node \"crc\" DevicePath \"\""
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.023689 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d-config\") on node \"crc\" DevicePath \"\""
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.023980 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:22.523962927 +0000 UTC m=+227.087482844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.125019 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.125333 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:22.625314008 +0000 UTC m=+227.188833925 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.203449 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="331e0c1d-7674-44da-bc5e-39358bbe9d07" path="/var/lib/kubelet/pods/331e0c1d-7674-44da-bc5e-39358bbe9d07/volumes"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.204208 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hlsw"]
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.226797 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.227152 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:22.72713855 +0000 UTC m=+227.290658457 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.285223 4911 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ljh7q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.285352 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ljh7q" podUID="331e0c1d-7674-44da-bc5e-39358bbe9d07" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.328103 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.328644 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:22.828620414 +0000 UTC m=+227.392140341 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.429942 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.430391 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:22.930374414 +0000 UTC m=+227.493894331 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.494790 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cvtlp"]
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.510327 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n2q7w"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.531236 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.531447 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.031408628 +0000 UTC m=+227.594928555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.531657 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.532344 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.0323275 +0000 UTC m=+227.595847417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.633385 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.633618 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.133579988 +0000 UTC m=+227.697099915 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.633775 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.634113 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.134098781 +0000 UTC m=+227.697618698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.652041 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"]
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.653353 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" containerName="controller-manager"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.653392 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" containerName="controller-manager"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.653937 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" containerName="controller-manager"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.654934 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.655904 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69859c5bbf-gm992"]
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.660176 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.661310 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.661760 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.662314 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.663477 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.666616 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.669935 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.670373 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-4plxk"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.671935 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"]
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.702375 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvtlp" event={"ID":"b986951d-80c8-4f06-a12b-9dd8047a7bf5","Type":"ContainerStarted","Data":"1119858619d302b238be1211ea5698919224e2f18b66de4050bcf65b71480887"}
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.703498 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w2clr" event={"ID":"e1796b5b-f2e1-4a7a-9463-039bb296626a","Type":"ContainerStarted","Data":"ef4db28afd25a9e8cd7c55eabfea72e0aaf4e86a99c96c684826fd1946b5bc00"}
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.704934 4911 generic.go:334] "Generic (PLEG): container finished" podID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerID="b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb" exitCode=0
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.705007 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttx9c" event={"ID":"9549430d-b06c-4c28-87bc-6320e73c31e5","Type":"ContainerDied","Data":"b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb"}
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.705030 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttx9c" event={"ID":"9549430d-b06c-4c28-87bc-6320e73c31e5","Type":"ContainerStarted","Data":"6ed2cf93eeb9238c65b001dc3d85d6036dd2b16d91d95e9ce224d23435058959"}
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.705366 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69859c5bbf-gm992"]
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.708605 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl" event={"ID":"d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d","Type":"ContainerDied","Data":"dab234ec58546b342e8a901a5976470077920993c44ace14f24f5eb6ffd2f771"}
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.708645 4911 scope.go:117] "RemoveContainer" containerID="958f6b47ebb0ea654802512722c2af73cfbcfabdcf539e93b1f46d16ca3febc9"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.708810 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pqkdl"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.711879 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hlsw" event={"ID":"713a2021-c450-41c7-b93d-ccee816a9820","Type":"ContainerStarted","Data":"ffed09385d3b82fe6e7bb82bc007e54dcccde72716996c51881b682b6de0d5ca"}
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.735256 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.735458 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbhpz\" (UniqueName: \"kubernetes.io/projected/60f3939e-4468-4458-b205-443507df1e5f-kube-api-access-qbhpz\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.735507 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-config\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.735528 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-client-ca\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.735550 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-config\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.735592 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-proxy-ca-bundles\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.735708 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f3939e-4468-4458-b205-443507df1e5f-serving-cert\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.735766 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8wnf\" (UniqueName: \"kubernetes.io/projected/02752ba7-8d45-4e41-a893-56a3b541dd63-kube-api-access-w8wnf\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.735799 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02752ba7-8d45-4e41-a893-56a3b541dd63-serving-cert\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.735829 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-client-ca\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.735949 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.235927773 +0000 UTC m=+227.799447690 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.764473 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.7644487 podStartE2EDuration="2.7644487s" podCreationTimestamp="2026-03-10 14:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:22.761113109 +0000 UTC m=+227.324633026" watchObservedRunningTime="2026-03-10 14:05:22.7644487 +0000 UTC m=+227.327968617"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.780047 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqkdl"]
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.785611 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqkdl"]
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.797138 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5c4nw"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.799403 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 14:05:22 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld
Mar 10 14:05:22 crc kubenswrapper[4911]: [+]process-running ok
Mar 10 14:05:22 crc kubenswrapper[4911]: healthz check failed
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.799444 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.829347 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.834852 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqz22"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.837510 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8wnf\" (UniqueName: \"kubernetes.io/projected/02752ba7-8d45-4e41-a893-56a3b541dd63-kube-api-access-w8wnf\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.837556 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02752ba7-8d45-4e41-a893-56a3b541dd63-serving-cert\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.837589 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-client-ca\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.837660 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-client-ca\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.837682 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbhpz\" (UniqueName: \"kubernetes.io/projected/60f3939e-4468-4458-b205-443507df1e5f-kube-api-access-qbhpz\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.837765 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-config\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.837795 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-config\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.838750 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-client-ca\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.840734 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-config\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.840881 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-config\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.840984 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-proxy-ca-bundles\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.841246 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.841764 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.341750361 +0000 UTC m=+227.905270278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.841904 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f3939e-4468-4458-b205-443507df1e5f-serving-cert\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.843542 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-proxy-ca-bundles\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.845572 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-client-ca\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.847204 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f3939e-4468-4458-b205-443507df1e5f-serving-cert\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.847643 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02752ba7-8d45-4e41-a893-56a3b541dd63-serving-cert\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.878073 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8wnf\" (UniqueName: \"kubernetes.io/projected/02752ba7-8d45-4e41-a893-56a3b541dd63-kube-api-access-w8wnf\") pod \"controller-manager-69859c5bbf-gm992\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992"
Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.878754 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbhpz\" (UniqueName: \"kubernetes.io/projected/60f3939e-4468-4458-b205-443507df1e5f-kube-api-access-qbhpz\") pod \"route-controller-manager-fd5d64b4f-vq4x4\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") "
pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.925235 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.925966 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.929981 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.930670 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.943445 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.943716 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faadc4e1-e90d-4a84-8268-f62704120a1a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"faadc4e1-e90d-4a84-8268-f62704120a1a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.943866 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faadc4e1-e90d-4a84-8268-f62704120a1a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"faadc4e1-e90d-4a84-8268-f62704120a1a\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 14:05:22 crc kubenswrapper[4911]: E0310 14:05:22.944601 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.444582128 +0000 UTC m=+228.008102045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:22 crc kubenswrapper[4911]: I0310 14:05:22.951573 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.011302 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.032707 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.046653 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faadc4e1-e90d-4a84-8268-f62704120a1a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"faadc4e1-e90d-4a84-8268-f62704120a1a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.046868 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.046910 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faadc4e1-e90d-4a84-8268-f62704120a1a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"faadc4e1-e90d-4a84-8268-f62704120a1a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.049800 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faadc4e1-e90d-4a84-8268-f62704120a1a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"faadc4e1-e90d-4a84-8268-f62704120a1a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.050205 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 14:05:23.550182561 +0000 UTC m=+228.113702478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.068555 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faadc4e1-e90d-4a84-8268-f62704120a1a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"faadc4e1-e90d-4a84-8268-f62704120a1a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.111099 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vrd7f"] Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.112627 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.116842 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.130971 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrd7f"] Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.149325 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.149560 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-utilities\") pod \"redhat-marketplace-vrd7f\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.149623 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzwk4\" (UniqueName: \"kubernetes.io/projected/124dcf69-acd6-4d61-ab64-3cf0840df098-kube-api-access-wzwk4\") pod \"redhat-marketplace-vrd7f\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.149667 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-catalog-content\") pod \"redhat-marketplace-vrd7f\" (UID: 
\"124dcf69-acd6-4d61-ab64-3cf0840df098\") " pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.149805 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.649785349 +0000 UTC m=+228.213305266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.253976 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-utilities\") pod \"redhat-marketplace-vrd7f\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.254059 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzwk4\" (UniqueName: \"kubernetes.io/projected/124dcf69-acd6-4d61-ab64-3cf0840df098-kube-api-access-wzwk4\") pod \"redhat-marketplace-vrd7f\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.254109 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.254135 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-catalog-content\") pod \"redhat-marketplace-vrd7f\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.254669 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-utilities\") pod \"redhat-marketplace-vrd7f\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.254783 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-catalog-content\") pod \"redhat-marketplace-vrd7f\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.255057 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.755039874 +0000 UTC m=+228.318559791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.268300 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.276382 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzwk4\" (UniqueName: \"kubernetes.io/projected/124dcf69-acd6-4d61-ab64-3cf0840df098-kube-api-access-wzwk4\") pod \"redhat-marketplace-vrd7f\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.280456 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.355344 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.355462 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 14:05:23.855437572 +0000 UTC m=+228.418957489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.355515 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d20f65c-3043-4310-83f1-300d0283f9b4-secret-volume\") pod \"0d20f65c-3043-4310-83f1-300d0283f9b4\" (UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.355546 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7j7x\" (UniqueName: \"kubernetes.io/projected/0d20f65c-3043-4310-83f1-300d0283f9b4-kube-api-access-s7j7x\") pod \"0d20f65c-3043-4310-83f1-300d0283f9b4\" (UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.355572 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d20f65c-3043-4310-83f1-300d0283f9b4-config-volume\") pod \"0d20f65c-3043-4310-83f1-300d0283f9b4\" (UID: \"0d20f65c-3043-4310-83f1-300d0283f9b4\") " Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.355792 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: 
\"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.356430 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.856420055 +0000 UTC m=+228.419939972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.356797 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d20f65c-3043-4310-83f1-300d0283f9b4-config-volume" (OuterVolumeSpecName: "config-volume") pod "0d20f65c-3043-4310-83f1-300d0283f9b4" (UID: "0d20f65c-3043-4310-83f1-300d0283f9b4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.362875 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d20f65c-3043-4310-83f1-300d0283f9b4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0d20f65c-3043-4310-83f1-300d0283f9b4" (UID: "0d20f65c-3043-4310-83f1-300d0283f9b4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.363322 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d20f65c-3043-4310-83f1-300d0283f9b4-kube-api-access-s7j7x" (OuterVolumeSpecName: "kube-api-access-s7j7x") pod "0d20f65c-3043-4310-83f1-300d0283f9b4" (UID: "0d20f65c-3043-4310-83f1-300d0283f9b4"). InnerVolumeSpecName "kube-api-access-s7j7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.395232 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69859c5bbf-gm992"] Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.453515 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"] Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.458712 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.459057 4911 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d20f65c-3043-4310-83f1-300d0283f9b4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.459074 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7j7x\" (UniqueName: \"kubernetes.io/projected/0d20f65c-3043-4310-83f1-300d0283f9b4-kube-api-access-s7j7x\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.459084 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/0d20f65c-3043-4310-83f1-300d0283f9b4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.459151 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:23.959134199 +0000 UTC m=+228.522654116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.462107 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:23 crc kubenswrapper[4911]: W0310 14:05:23.464946 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f3939e_4468_4458_b205_443507df1e5f.slice/crio-471666a813605bf0ab55ccad952de8ffb2e282e1de74a0595dd560a823c3dcac WatchSource:0}: Error finding container 471666a813605bf0ab55ccad952de8ffb2e282e1de74a0595dd560a823c3dcac: Status 404 returned error can't find the container with id 471666a813605bf0ab55ccad952de8ffb2e282e1de74a0595dd560a823c3dcac Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.467810 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.472775 4911 patch_prober.go:28] interesting 
pod/console-f9d7485db-lpjl7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.472841 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lpjl7" podUID="28f8f1c7-122d-47a4-8de8-90db75c3365b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.475113 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.482986 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4sbh" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.509991 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-48nht"] Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.511233 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d20f65c-3043-4310-83f1-300d0283f9b4" containerName="collect-profiles" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.511254 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d20f65c-3043-4310-83f1-300d0283f9b4" containerName="collect-profiles" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.511403 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d20f65c-3043-4310-83f1-300d0283f9b4" containerName="collect-profiles" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.527250 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48nht" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.550920 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48nht"] Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.564610 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvjl4\" (UniqueName: \"kubernetes.io/projected/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-kube-api-access-pvjl4\") pod \"redhat-marketplace-48nht\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " pod="openshift-marketplace/redhat-marketplace-48nht" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.565065 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-catalog-content\") pod \"redhat-marketplace-48nht\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " pod="openshift-marketplace/redhat-marketplace-48nht" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.565155 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.565305 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-utilities\") pod \"redhat-marketplace-48nht\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " pod="openshift-marketplace/redhat-marketplace-48nht" Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 
14:05:23.566569 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.066553966 +0000 UTC m=+228.630073883 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.575069 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrd7f"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.668979 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.669443 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-utilities\") pod \"redhat-marketplace-48nht\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " pod="openshift-marketplace/redhat-marketplace-48nht"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.669509 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvjl4\" (UniqueName: \"kubernetes.io/projected/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-kube-api-access-pvjl4\") pod \"redhat-marketplace-48nht\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " pod="openshift-marketplace/redhat-marketplace-48nht"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.669555 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-catalog-content\") pod \"redhat-marketplace-48nht\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " pod="openshift-marketplace/redhat-marketplace-48nht"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.670428 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-catalog-content\") pod \"redhat-marketplace-48nht\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " pod="openshift-marketplace/redhat-marketplace-48nht"
Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.670547 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.17051945 +0000 UTC m=+228.734039377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.673770 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-utilities\") pod \"redhat-marketplace-48nht\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " pod="openshift-marketplace/redhat-marketplace-48nht"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.699819 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.706536 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body=
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.706606 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.706536 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body=
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.707075 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.709402 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvjl4\" (UniqueName: \"kubernetes.io/projected/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-kube-api-access-pvjl4\") pod \"redhat-marketplace-48nht\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " pod="openshift-marketplace/redhat-marketplace-48nht"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.733587 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" event={"ID":"02752ba7-8d45-4e41-a893-56a3b541dd63","Type":"ContainerStarted","Data":"561ec485a592a972a8e45390881eacbd5474f3bd360837c0823e5d217ef18b3f"}
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.735626 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" event={"ID":"60f3939e-4468-4458-b205-443507df1e5f","Type":"ContainerStarted","Data":"471666a813605bf0ab55ccad952de8ffb2e282e1de74a0595dd560a823c3dcac"}
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.738509 4911 generic.go:334] "Generic (PLEG): container finished" podID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerID="bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4" exitCode=0
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.738579 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w2clr" event={"ID":"e1796b5b-f2e1-4a7a-9463-039bb296626a","Type":"ContainerDied","Data":"bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4"}
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.766781 4911 generic.go:334] "Generic (PLEG): container finished" podID="3a076468-8db4-42a1-9780-77d667f8375a" containerID="d5c1fe7c85e4be8a1bbbe42d84ef386b062615a2423f16834b9d7687c7a06899" exitCode=0
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.766912 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3a076468-8db4-42a1-9780-77d667f8375a","Type":"ContainerDied","Data":"d5c1fe7c85e4be8a1bbbe42d84ef386b062615a2423f16834b9d7687c7a06899"}
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.769387 4911 generic.go:334] "Generic (PLEG): container finished" podID="713a2021-c450-41c7-b93d-ccee816a9820" containerID="cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b" exitCode=0
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.769442 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hlsw" event={"ID":"713a2021-c450-41c7-b93d-ccee816a9820","Type":"ContainerDied","Data":"cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b"}
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.773554 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.773595 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2"
Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.773974 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.273960631 +0000 UTC m=+228.837480548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.774456 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2" event={"ID":"0d20f65c-3043-4310-83f1-300d0283f9b4","Type":"ContainerDied","Data":"736e5f7f64cbefe8f9391a8d0fa29f06494536b0b6af941be546ef62301d9d9c"}
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.774519 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="736e5f7f64cbefe8f9391a8d0fa29f06494536b0b6af941be546ef62301d9d9c"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.776796 4911 generic.go:334] "Generic (PLEG): container finished" podID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerID="1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4" exitCode=0
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.776837 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvtlp" event={"ID":"b986951d-80c8-4f06-a12b-9dd8047a7bf5","Type":"ContainerDied","Data":"1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4"}
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.799986 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.802645 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 14:05:23 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld
Mar 10 14:05:23 crc kubenswrapper[4911]: [+]process-running ok
Mar 10 14:05:23 crc kubenswrapper[4911]: healthz check failed
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.802689 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.877513 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.880090 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.380035775 +0000 UTC m=+228.943555882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.953260 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48nht"
Mar 10 14:05:23 crc kubenswrapper[4911]: I0310 14:05:23.980622 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:23 crc kubenswrapper[4911]: E0310 14:05:23.981144 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.481119899 +0000 UTC m=+229.044639806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.084499 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.085095 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.585075153 +0000 UTC m=+229.148595070 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.107691 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrd7f"]
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.114643 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-knb5d"]
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.116414 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: W0310 14:05:24.121540 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod124dcf69_acd6_4d61_ab64_3cf0840df098.slice/crio-090066ce5d6a66c35035dd3841a2af83c46b2c8cc0581e3c4f78e4146b3e2b8d WatchSource:0}: Error finding container 090066ce5d6a66c35035dd3841a2af83c46b2c8cc0581e3c4f78e4146b3e2b8d: Status 404 returned error can't find the container with id 090066ce5d6a66c35035dd3841a2af83c46b2c8cc0581e3c4f78e4146b3e2b8d
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.122344 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.124857 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knb5d"]
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.186860 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.187330 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-catalog-content\") pod \"redhat-operators-knb5d\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.187401 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-utilities\") pod \"redhat-operators-knb5d\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.187473 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kddf7\" (UniqueName: \"kubernetes.io/projected/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-kube-api-access-kddf7\") pod \"redhat-operators-knb5d\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.187962 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.68793645 +0000 UTC m=+229.251456567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.206032 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d" path="/var/lib/kubelet/pods/d6c1b4d2-0c1e-4cc5-8f1f-a3d724e5d62d/volumes"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.271190 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48nht"]
Mar 10 14:05:24 crc kubenswrapper[4911]: W0310 14:05:24.282567 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod744c8ea6_3be6_496d_a6fc_002d3f0f95e4.slice/crio-6aa85add1472227f22140b3780ac28c2673f0e70ad9adb7552bb8c934137f1aa WatchSource:0}: Error finding container 6aa85add1472227f22140b3780ac28c2673f0e70ad9adb7552bb8c934137f1aa: Status 404 returned error can't find the container with id 6aa85add1472227f22140b3780ac28c2673f0e70ad9adb7552bb8c934137f1aa
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.288375 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.288569 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.788532463 +0000 UTC m=+229.352052390 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.288800 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kddf7\" (UniqueName: \"kubernetes.io/projected/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-kube-api-access-kddf7\") pod \"redhat-operators-knb5d\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.288881 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.288945 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-catalog-content\") pod \"redhat-operators-knb5d\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.288988 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-utilities\") pod \"redhat-operators-knb5d\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.289460 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.789435804 +0000 UTC m=+229.352955721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.289568 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-catalog-content\") pod \"redhat-operators-knb5d\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.321559 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-utilities\") pod \"redhat-operators-knb5d\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.338857 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kddf7\" (UniqueName: \"kubernetes.io/projected/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-kube-api-access-kddf7\") pod \"redhat-operators-knb5d\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.389916 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.390132 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.890097548 +0000 UTC m=+229.453617465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.390270 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.390642 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.890634351 +0000 UTC m=+229.454154268 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.449351 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knb5d"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.491837 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.492305 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:24.992284918 +0000 UTC m=+229.555804835 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.531362 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9pm6m"]
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.533852 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.559258 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9pm6m"]
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.603257 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-catalog-content\") pod \"redhat-operators-9pm6m\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.603304 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ml8k\" (UniqueName: \"kubernetes.io/projected/f11c754a-10a0-46ac-b171-5ccfecebdb7c-kube-api-access-8ml8k\") pod \"redhat-operators-9pm6m\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.603334 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-utilities\") pod \"redhat-operators-9pm6m\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.603373 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.603791 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.103776273 +0000 UTC m=+229.667296190 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.704616 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.704847 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.204811436 +0000 UTC m=+229.768331353 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.705030 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-catalog-content\") pod \"redhat-operators-9pm6m\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.705066 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ml8k\" (UniqueName: \"kubernetes.io/projected/f11c754a-10a0-46ac-b171-5ccfecebdb7c-kube-api-access-8ml8k\") pod \"redhat-operators-9pm6m\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.705094 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-utilities\") pod \"redhat-operators-9pm6m\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.705129 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6"
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.705491 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.205475782 +0000 UTC m=+229.768995699 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.705993 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-utilities\") pod \"redhat-operators-9pm6m\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.706119 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-catalog-content\") pod \"redhat-operators-9pm6m\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.742155 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ml8k\" (UniqueName: \"kubernetes.io/projected/f11c754a-10a0-46ac-b171-5ccfecebdb7c-kube-api-access-8ml8k\") pod \"redhat-operators-9pm6m\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.802588 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 14:05:24 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld
Mar 10 14:05:24 crc kubenswrapper[4911]: [+]process-running ok
Mar 10 14:05:24 crc kubenswrapper[4911]: healthz check failed
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.802649 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.806561 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.807007 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.306989547 +0000 UTC m=+229.870509454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.859810 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" event={"ID":"60f3939e-4468-4458-b205-443507df1e5f","Type":"ContainerStarted","Data":"3eec423ab2925d1af16f6de208f681e60f83213abdc5e5571cb113bf8317957b"} Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.860599 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.861925 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrd7f" event={"ID":"124dcf69-acd6-4d61-ab64-3cf0840df098","Type":"ContainerStarted","Data":"5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787"} Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.861948 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrd7f" event={"ID":"124dcf69-acd6-4d61-ab64-3cf0840df098","Type":"ContainerStarted","Data":"090066ce5d6a66c35035dd3841a2af83c46b2c8cc0581e3c4f78e4146b3e2b8d"} Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.863744 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"faadc4e1-e90d-4a84-8268-f62704120a1a","Type":"ContainerStarted","Data":"833114b221f88ae94579be7fdd5c19d674f2bb05ded42b0f4b5a4cf9f5d8a477"} Mar 10 14:05:24 
crc kubenswrapper[4911]: I0310 14:05:24.863768 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"faadc4e1-e90d-4a84-8268-f62704120a1a","Type":"ContainerStarted","Data":"dc4ab158c03bb2736a5b2fe26592c020ddb4544909d8b568395550e86b4600f7"} Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.865130 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48nht" event={"ID":"744c8ea6-3be6-496d-a6fc-002d3f0f95e4","Type":"ContainerStarted","Data":"a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e"} Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.865154 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48nht" event={"ID":"744c8ea6-3be6-496d-a6fc-002d3f0f95e4","Type":"ContainerStarted","Data":"6aa85add1472227f22140b3780ac28c2673f0e70ad9adb7552bb8c934137f1aa"} Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.866082 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" event={"ID":"02752ba7-8d45-4e41-a893-56a3b541dd63","Type":"ContainerStarted","Data":"7eb9cd67326b9e19ba44d30f604a9d3a8245864254cbf08a8b27d579908ee514"} Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.866621 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.866829 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9pm6m" Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.881284 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.900021 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" podStartSLOduration=3.899992297 podStartE2EDuration="3.899992297s" podCreationTimestamp="2026-03-10 14:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:24.898432479 +0000 UTC m=+229.461952396" watchObservedRunningTime="2026-03-10 14:05:24.899992297 +0000 UTC m=+229.463512214" Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.908711 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:24 crc kubenswrapper[4911]: E0310 14:05:24.909923 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.409904095 +0000 UTC m=+229.973424002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.928437 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.931028 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" event={"ID":"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37","Type":"ContainerStarted","Data":"4ffb4a3fa738d978a6db16328f70adf61729a863e49593a43c940716e806e819"} Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.974570 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knb5d"] Mar 10 14:05:24 crc kubenswrapper[4911]: I0310 14:05:24.975928 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.975906775 podStartE2EDuration="2.975906775s" podCreationTimestamp="2026-03-10 14:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:24.967429861 +0000 UTC m=+229.530949788" watchObservedRunningTime="2026-03-10 14:05:24.975906775 +0000 UTC m=+229.539426682" Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.004589 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" podStartSLOduration=4.004565385 
podStartE2EDuration="4.004565385s" podCreationTimestamp="2026-03-10 14:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:25.004031832 +0000 UTC m=+229.567551759" watchObservedRunningTime="2026-03-10 14:05:25.004565385 +0000 UTC m=+229.568085302" Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.010388 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:25 crc kubenswrapper[4911]: E0310 14:05:25.010961 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.510521018 +0000 UTC m=+230.074040935 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.011105 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:25 crc kubenswrapper[4911]: E0310 14:05:25.011446 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.511439921 +0000 UTC m=+230.074959838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.113262 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:25 crc kubenswrapper[4911]: E0310 14:05:25.113810 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.613783515 +0000 UTC m=+230.177303432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.184776 4911 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lcvpv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 10 14:05:25 crc kubenswrapper[4911]: [+]log ok Mar 10 14:05:25 crc kubenswrapper[4911]: [+]etcd ok Mar 10 14:05:25 crc kubenswrapper[4911]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 10 14:05:25 crc kubenswrapper[4911]: [+]poststarthook/generic-apiserver-start-informers ok Mar 10 14:05:25 crc kubenswrapper[4911]: [+]poststarthook/max-in-flight-filter ok Mar 10 14:05:25 crc kubenswrapper[4911]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 10 14:05:25 crc kubenswrapper[4911]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 10 14:05:25 crc kubenswrapper[4911]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 10 14:05:25 crc kubenswrapper[4911]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 10 14:05:25 crc kubenswrapper[4911]: [+]poststarthook/project.openshift.io-projectcache ok Mar 10 14:05:25 crc kubenswrapper[4911]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 10 14:05:25 crc kubenswrapper[4911]: [+]poststarthook/openshift.io-startinformers ok Mar 10 14:05:25 crc kubenswrapper[4911]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 10 14:05:25 crc 
kubenswrapper[4911]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 10 14:05:25 crc kubenswrapper[4911]: livez check failed Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.185215 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" podUID="3ed3ce1b-b217-44d0-8f24-57c9eb678ea7" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.217094 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:25 crc kubenswrapper[4911]: E0310 14:05:25.217517 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.717501793 +0000 UTC m=+230.281021710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.322837 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:25 crc kubenswrapper[4911]: E0310 14:05:25.323404 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.823382023 +0000 UTC m=+230.386901940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.375117 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9pm6m"] Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.425187 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:25 crc kubenswrapper[4911]: E0310 14:05:25.425716 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:25.925698297 +0000 UTC m=+230.489218214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.429239 4911 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.527034 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:25 crc kubenswrapper[4911]: E0310 14:05:25.527440 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 14:05:26.027421036 +0000 UTC m=+230.590940953 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.628908 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:25 crc kubenswrapper[4911]: E0310 14:05:25.629262 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 14:05:26.129249419 +0000 UTC m=+230.692769336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m5pk6" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.629459 4911 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T14:05:25.429266413Z","Handler":null,"Name":""} Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.632174 4911 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.632200 4911 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.730576 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.734681 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.799211 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:25 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:25 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:25 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.799290 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.832234 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.835374 4911 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.835468 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.895954 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m5pk6\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.955916 4911 generic.go:334] "Generic (PLEG): container finished" podID="faadc4e1-e90d-4a84-8268-f62704120a1a" containerID="833114b221f88ae94579be7fdd5c19d674f2bb05ded42b0f4b5a4cf9f5d8a477" exitCode=0 Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.956044 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"faadc4e1-e90d-4a84-8268-f62704120a1a","Type":"ContainerDied","Data":"833114b221f88ae94579be7fdd5c19d674f2bb05ded42b0f4b5a4cf9f5d8a477"} Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.958062 4911 generic.go:334] "Generic (PLEG): container finished" podID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerID="a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e" exitCode=0 Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.958127 4911 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48nht" event={"ID":"744c8ea6-3be6-496d-a6fc-002d3f0f95e4","Type":"ContainerDied","Data":"a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e"} Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.962909 4911 generic.go:334] "Generic (PLEG): container finished" podID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerID="2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55" exitCode=0 Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.963005 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knb5d" event={"ID":"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9","Type":"ContainerDied","Data":"2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55"} Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.963059 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knb5d" event={"ID":"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9","Type":"ContainerStarted","Data":"f012d7ee8fc0b3720c3f33c14f0131dab4a817632807c1044cd5cec5c402b45c"} Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.965312 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" event={"ID":"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37","Type":"ContainerStarted","Data":"9c341c5e2c9bd0bd62b84358c84f0ee30c2cc5a2ef6539df6b038359ecd135a5"} Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.967003 4911 generic.go:334] "Generic (PLEG): container finished" podID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerID="5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787" exitCode=0 Mar 10 14:05:25 crc kubenswrapper[4911]: I0310 14:05:25.967057 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrd7f" 
event={"ID":"124dcf69-acd6-4d61-ab64-3cf0840df098","Type":"ContainerDied","Data":"5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787"} Mar 10 14:05:26 crc kubenswrapper[4911]: I0310 14:05:26.138351 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:26 crc kubenswrapper[4911]: I0310 14:05:26.209496 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 14:05:26 crc kubenswrapper[4911]: I0310 14:05:26.731244 4911 ???:1] "http: TLS handshake error from 192.168.126.11:51660: no serving certificate available for the kubelet" Mar 10 14:05:26 crc kubenswrapper[4911]: I0310 14:05:26.799717 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:26 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:26 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:26 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:26 crc kubenswrapper[4911]: I0310 14:05:26.799814 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:26 crc kubenswrapper[4911]: I0310 14:05:26.840301 4911 ???:1] "http: TLS handshake error from 192.168.126.11:51662: no serving certificate available for the kubelet" Mar 10 14:05:27 crc kubenswrapper[4911]: I0310 14:05:27.799354 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:27 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:27 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:27 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:27 crc kubenswrapper[4911]: I0310 14:05:27.799490 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:28 crc kubenswrapper[4911]: I0310 14:05:28.567162 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8z6vx" Mar 10 14:05:28 crc kubenswrapper[4911]: I0310 14:05:28.751447 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:28 crc kubenswrapper[4911]: I0310 14:05:28.759942 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lcvpv" Mar 10 14:05:28 crc kubenswrapper[4911]: I0310 14:05:28.798298 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:28 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:28 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:28 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:28 crc kubenswrapper[4911]: I0310 14:05:28.798347 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Mar 10 14:05:29 crc kubenswrapper[4911]: I0310 14:05:29.799493 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:29 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:29 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:29 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:29 crc kubenswrapper[4911]: I0310 14:05:29.799601 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:29 crc kubenswrapper[4911]: I0310 14:05:29.907574 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.002654 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faadc4e1-e90d-4a84-8268-f62704120a1a-kube-api-access\") pod \"faadc4e1-e90d-4a84-8268-f62704120a1a\" (UID: \"faadc4e1-e90d-4a84-8268-f62704120a1a\") " Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.002758 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faadc4e1-e90d-4a84-8268-f62704120a1a-kubelet-dir\") pod \"faadc4e1-e90d-4a84-8268-f62704120a1a\" (UID: \"faadc4e1-e90d-4a84-8268-f62704120a1a\") " Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.003293 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/faadc4e1-e90d-4a84-8268-f62704120a1a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"faadc4e1-e90d-4a84-8268-f62704120a1a" (UID: "faadc4e1-e90d-4a84-8268-f62704120a1a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.003933 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"faadc4e1-e90d-4a84-8268-f62704120a1a","Type":"ContainerDied","Data":"dc4ab158c03bb2736a5b2fe26592c020ddb4544909d8b568395550e86b4600f7"} Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.003967 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc4ab158c03bb2736a5b2fe26592c020ddb4544909d8b568395550e86b4600f7" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.004040 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.018236 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faadc4e1-e90d-4a84-8268-f62704120a1a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "faadc4e1-e90d-4a84-8268-f62704120a1a" (UID: "faadc4e1-e90d-4a84-8268-f62704120a1a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.104550 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faadc4e1-e90d-4a84-8268-f62704120a1a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.104600 4911 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faadc4e1-e90d-4a84-8268-f62704120a1a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:30 crc kubenswrapper[4911]: W0310 14:05:30.748188 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf11c754a_10a0_46ac_b171_5ccfecebdb7c.slice/crio-ba29199479e057e8f57a4a51909765322ae20b00593013ae73bd44b96e0423e8 WatchSource:0}: Error finding container ba29199479e057e8f57a4a51909765322ae20b00593013ae73bd44b96e0423e8: Status 404 returned error can't find the container with id ba29199479e057e8f57a4a51909765322ae20b00593013ae73bd44b96e0423e8 Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.780840 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.813741 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:30 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:30 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:30 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.813802 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.815232 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a076468-8db4-42a1-9780-77d667f8375a-kubelet-dir\") pod \"3a076468-8db4-42a1-9780-77d667f8375a\" (UID: \"3a076468-8db4-42a1-9780-77d667f8375a\") " Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.815343 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a076468-8db4-42a1-9780-77d667f8375a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3a076468-8db4-42a1-9780-77d667f8375a" (UID: "3a076468-8db4-42a1-9780-77d667f8375a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.815382 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a076468-8db4-42a1-9780-77d667f8375a-kube-api-access\") pod \"3a076468-8db4-42a1-9780-77d667f8375a\" (UID: \"3a076468-8db4-42a1-9780-77d667f8375a\") " Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.815704 4911 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a076468-8db4-42a1-9780-77d667f8375a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.819377 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a076468-8db4-42a1-9780-77d667f8375a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3a076468-8db4-42a1-9780-77d667f8375a" (UID: "3a076468-8db4-42a1-9780-77d667f8375a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:05:30 crc kubenswrapper[4911]: I0310 14:05:30.917040 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a076468-8db4-42a1-9780-77d667f8375a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:31 crc kubenswrapper[4911]: I0310 14:05:31.021679 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3a076468-8db4-42a1-9780-77d667f8375a","Type":"ContainerDied","Data":"44a17d515d2c32bfa3a1d781b9c2f4669c3ba085d6d0f9ee42e45fcd3ef32a9c"} Mar 10 14:05:31 crc kubenswrapper[4911]: I0310 14:05:31.021743 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44a17d515d2c32bfa3a1d781b9c2f4669c3ba085d6d0f9ee42e45fcd3ef32a9c" Mar 10 14:05:31 crc kubenswrapper[4911]: I0310 14:05:31.021754 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 14:05:31 crc kubenswrapper[4911]: I0310 14:05:31.022932 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pm6m" event={"ID":"f11c754a-10a0-46ac-b171-5ccfecebdb7c","Type":"ContainerStarted","Data":"ba29199479e057e8f57a4a51909765322ae20b00593013ae73bd44b96e0423e8"} Mar 10 14:05:31 crc kubenswrapper[4911]: I0310 14:05:31.808325 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:31 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Mar 10 14:05:31 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:31 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:31 crc kubenswrapper[4911]: I0310 14:05:31.808668 4911 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:32 crc kubenswrapper[4911]: I0310 14:05:32.801239 4911 patch_prober.go:28] interesting pod/router-default-5444994796-5c4nw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 14:05:32 crc kubenswrapper[4911]: [+]has-synced ok Mar 10 14:05:32 crc kubenswrapper[4911]: [+]process-running ok Mar 10 14:05:32 crc kubenswrapper[4911]: healthz check failed Mar 10 14:05:32 crc kubenswrapper[4911]: I0310 14:05:32.801312 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5c4nw" podUID="05d84958-d228-406a-9337-389f9a5f286d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 14:05:33 crc kubenswrapper[4911]: I0310 14:05:33.462753 4911 patch_prober.go:28] interesting pod/console-f9d7485db-lpjl7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 10 14:05:33 crc kubenswrapper[4911]: I0310 14:05:33.462825 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lpjl7" podUID="28f8f1c7-122d-47a4-8de8-90db75c3365b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 10 14:05:33 crc kubenswrapper[4911]: I0310 14:05:33.709120 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/\": 
dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:05:33 crc kubenswrapper[4911]: I0310 14:05:33.709152 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:05:33 crc kubenswrapper[4911]: I0310 14:05:33.709224 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:05:33 crc kubenswrapper[4911]: I0310 14:05:33.709208 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:05:33 crc kubenswrapper[4911]: I0310 14:05:33.799844 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:33 crc kubenswrapper[4911]: I0310 14:05:33.802256 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5c4nw" Mar 10 14:05:34 crc kubenswrapper[4911]: I0310 14:05:34.302859 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m5pk6"] Mar 10 14:05:34 crc kubenswrapper[4911]: W0310 14:05:34.316893 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf89e6f0d_a78a_4543_9b03_ad1245748d9a.slice/crio-b1173d6c86d458decd5ed2b4e4160f6cabc5ee7814ad80dd6a02787e9970ea79 WatchSource:0}: Error finding container b1173d6c86d458decd5ed2b4e4160f6cabc5ee7814ad80dd6a02787e9970ea79: Status 404 returned error can't find the container with id b1173d6c86d458decd5ed2b4e4160f6cabc5ee7814ad80dd6a02787e9970ea79 Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.066401 4911 generic.go:334] "Generic (PLEG): container finished" podID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerID="c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5" exitCode=0 Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.066569 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pm6m" event={"ID":"f11c754a-10a0-46ac-b171-5ccfecebdb7c","Type":"ContainerDied","Data":"c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5"} Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.076310 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552524-98vcv" event={"ID":"30365124-15de-458c-b8f8-97b0fab41da4","Type":"ContainerStarted","Data":"32231f2264d17f7fca1c7041ef8ceac92288228832e5591cef79f082073ca2ed"} Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.099429 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" event={"ID":"f89e6f0d-a78a-4543-9b03-ad1245748d9a","Type":"ContainerStarted","Data":"231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a"} Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.099549 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" event={"ID":"f89e6f0d-a78a-4543-9b03-ad1245748d9a","Type":"ContainerStarted","Data":"b1173d6c86d458decd5ed2b4e4160f6cabc5ee7814ad80dd6a02787e9970ea79"} Mar 10 14:05:35 crc 
kubenswrapper[4911]: I0310 14:05:35.100319 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.117339 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552524-98vcv" podStartSLOduration=76.001479326 podStartE2EDuration="1m35.117319159s" podCreationTimestamp="2026-03-10 14:04:00 +0000 UTC" firstStartedPulling="2026-03-10 14:05:15.098269005 +0000 UTC m=+219.661788922" lastFinishedPulling="2026-03-10 14:05:34.214108838 +0000 UTC m=+238.777628755" observedRunningTime="2026-03-10 14:05:35.114538282 +0000 UTC m=+239.678058209" watchObservedRunningTime="2026-03-10 14:05:35.117319159 +0000 UTC m=+239.680839076" Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.144239 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" podStartSLOduration=204.144211987 podStartE2EDuration="3m24.144211987s" podCreationTimestamp="2026-03-10 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:35.137901115 +0000 UTC m=+239.701421052" watchObservedRunningTime="2026-03-10 14:05:35.144211987 +0000 UTC m=+239.707731904" Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.147828 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" event={"ID":"ad0d7b23-d43a-4ee1-b3af-4bd65b7d8f37","Type":"ContainerStarted","Data":"2aa2de8e9bb9484ad8872cdd0c0b6d67ca03a52666c98130b4761fdad717fbbe"} Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.185823 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rmmf6" podStartSLOduration=25.185770317 podStartE2EDuration="25.185770317s" 
podCreationTimestamp="2026-03-10 14:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:05:35.181899524 +0000 UTC m=+239.745419441" watchObservedRunningTime="2026-03-10 14:05:35.185770317 +0000 UTC m=+239.749290254" Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.700375 4911 csr.go:261] certificate signing request csr-k27kt is approved, waiting to be issued Mar 10 14:05:35 crc kubenswrapper[4911]: I0310 14:05:35.710301 4911 csr.go:257] certificate signing request csr-k27kt is issued Mar 10 14:05:36 crc kubenswrapper[4911]: I0310 14:05:36.246186 4911 generic.go:334] "Generic (PLEG): container finished" podID="30365124-15de-458c-b8f8-97b0fab41da4" containerID="32231f2264d17f7fca1c7041ef8ceac92288228832e5591cef79f082073ca2ed" exitCode=0 Mar 10 14:05:36 crc kubenswrapper[4911]: I0310 14:05:36.246850 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552524-98vcv" event={"ID":"30365124-15de-458c-b8f8-97b0fab41da4","Type":"ContainerDied","Data":"32231f2264d17f7fca1c7041ef8ceac92288228832e5591cef79f082073ca2ed"} Mar 10 14:05:36 crc kubenswrapper[4911]: I0310 14:05:36.712062 4911 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-10 16:45:52.313864465 +0000 UTC Mar 10 14:05:36 crc kubenswrapper[4911]: I0310 14:05:36.712107 4911 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7346h40m15.601760471s for next certificate rotation Mar 10 14:05:37 crc kubenswrapper[4911]: I0310 14:05:37.714509 4911 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-11 19:05:59.49693021 +0000 UTC Mar 10 14:05:37 crc kubenswrapper[4911]: I0310 14:05:37.714560 4911 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 
5909h0m21.782373925s for next certificate rotation Mar 10 14:05:37 crc kubenswrapper[4911]: I0310 14:05:37.825748 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552524-98vcv" Mar 10 14:05:37 crc kubenswrapper[4911]: I0310 14:05:37.975336 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqdwt\" (UniqueName: \"kubernetes.io/projected/30365124-15de-458c-b8f8-97b0fab41da4-kube-api-access-hqdwt\") pod \"30365124-15de-458c-b8f8-97b0fab41da4\" (UID: \"30365124-15de-458c-b8f8-97b0fab41da4\") " Mar 10 14:05:37 crc kubenswrapper[4911]: I0310 14:05:37.991765 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30365124-15de-458c-b8f8-97b0fab41da4-kube-api-access-hqdwt" (OuterVolumeSpecName: "kube-api-access-hqdwt") pod "30365124-15de-458c-b8f8-97b0fab41da4" (UID: "30365124-15de-458c-b8f8-97b0fab41da4"). InnerVolumeSpecName "kube-api-access-hqdwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:05:38 crc kubenswrapper[4911]: I0310 14:05:38.078052 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqdwt\" (UniqueName: \"kubernetes.io/projected/30365124-15de-458c-b8f8-97b0fab41da4-kube-api-access-hqdwt\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:38 crc kubenswrapper[4911]: I0310 14:05:38.285473 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552524-98vcv" event={"ID":"30365124-15de-458c-b8f8-97b0fab41da4","Type":"ContainerDied","Data":"fc9e42fd5098d6176d2287923450c19fb71f879319c73b51ec3a78e6e695c00d"} Mar 10 14:05:38 crc kubenswrapper[4911]: I0310 14:05:38.285525 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc9e42fd5098d6176d2287923450c19fb71f879319c73b51ec3a78e6e695c00d" Mar 10 14:05:38 crc kubenswrapper[4911]: I0310 14:05:38.285687 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552524-98vcv" Mar 10 14:05:40 crc kubenswrapper[4911]: I0310 14:05:40.355687 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69859c5bbf-gm992"] Mar 10 14:05:40 crc kubenswrapper[4911]: I0310 14:05:40.356280 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" podUID="02752ba7-8d45-4e41-a893-56a3b541dd63" containerName="controller-manager" containerID="cri-o://7eb9cd67326b9e19ba44d30f604a9d3a8245864254cbf08a8b27d579908ee514" gracePeriod=30 Mar 10 14:05:40 crc kubenswrapper[4911]: I0310 14:05:40.371518 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"] Mar 10 14:05:40 crc kubenswrapper[4911]: I0310 14:05:40.371826 4911 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" podUID="60f3939e-4468-4458-b205-443507df1e5f" containerName="route-controller-manager" containerID="cri-o://3eec423ab2925d1af16f6de208f681e60f83213abdc5e5571cb113bf8317957b" gracePeriod=30 Mar 10 14:05:41 crc kubenswrapper[4911]: I0310 14:05:41.353147 4911 generic.go:334] "Generic (PLEG): container finished" podID="02752ba7-8d45-4e41-a893-56a3b541dd63" containerID="7eb9cd67326b9e19ba44d30f604a9d3a8245864254cbf08a8b27d579908ee514" exitCode=0 Mar 10 14:05:41 crc kubenswrapper[4911]: I0310 14:05:41.353246 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" event={"ID":"02752ba7-8d45-4e41-a893-56a3b541dd63","Type":"ContainerDied","Data":"7eb9cd67326b9e19ba44d30f604a9d3a8245864254cbf08a8b27d579908ee514"} Mar 10 14:05:41 crc kubenswrapper[4911]: I0310 14:05:41.401318 4911 generic.go:334] "Generic (PLEG): container finished" podID="60f3939e-4468-4458-b205-443507df1e5f" containerID="3eec423ab2925d1af16f6de208f681e60f83213abdc5e5571cb113bf8317957b" exitCode=0 Mar 10 14:05:41 crc kubenswrapper[4911]: I0310 14:05:41.401386 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" event={"ID":"60f3939e-4468-4458-b205-443507df1e5f","Type":"ContainerDied","Data":"3eec423ab2925d1af16f6de208f681e60f83213abdc5e5571cb113bf8317957b"} Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.013146 4911 patch_prober.go:28] interesting pod/route-controller-manager-fd5d64b4f-vq4x4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.013259 4911 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" podUID="60f3939e-4468-4458-b205-443507df1e5f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.033611 4911 patch_prober.go:28] interesting pod/controller-manager-69859c5bbf-gm992 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.033702 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" podUID="02752ba7-8d45-4e41-a893-56a3b541dd63" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.486175 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.490338 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.706382 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.706456 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.706469 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.706535 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.706567 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-nfhwf" Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.707188 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.707233 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.707268 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" 
containerStatusID={"Type":"cri-o","ID":"ff97ffb11a3306456555b324514234449f0eec82509900e46784181055caa665"} pod="openshift-console/downloads-7954f5f757-nfhwf" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 10 14:05:43 crc kubenswrapper[4911]: I0310 14:05:43.707309 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" containerID="cri-o://ff97ffb11a3306456555b324514234449f0eec82509900e46784181055caa665" gracePeriod=2 Mar 10 14:05:46 crc kubenswrapper[4911]: I0310 14:05:46.467397 4911 generic.go:334] "Generic (PLEG): container finished" podID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerID="ff97ffb11a3306456555b324514234449f0eec82509900e46784181055caa665" exitCode=0 Mar 10 14:05:46 crc kubenswrapper[4911]: I0310 14:05:46.467450 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nfhwf" event={"ID":"1ad00bf3-d146-4d5d-806e-fb340e3762bf","Type":"ContainerDied","Data":"ff97ffb11a3306456555b324514234449f0eec82509900e46784181055caa665"} Mar 10 14:05:48 crc kubenswrapper[4911]: I0310 14:05:48.521311 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:05:48 crc kubenswrapper[4911]: I0310 14:05:48.521503 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.707223 
4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.709917 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.738761 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-pmkwn" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.904898 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.909928 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942330 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd"] Mar 10 14:05:53 crc kubenswrapper[4911]: E0310 14:05:53.942626 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a076468-8db4-42a1-9780-77d667f8375a" containerName="pruner" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942640 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a076468-8db4-42a1-9780-77d667f8375a" containerName="pruner" Mar 10 14:05:53 crc kubenswrapper[4911]: E0310 14:05:53.942652 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f3939e-4468-4458-b205-443507df1e5f" containerName="route-controller-manager" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942659 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f3939e-4468-4458-b205-443507df1e5f" containerName="route-controller-manager" Mar 10 14:05:53 crc kubenswrapper[4911]: E0310 14:05:53.942671 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30365124-15de-458c-b8f8-97b0fab41da4" containerName="oc" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942677 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="30365124-15de-458c-b8f8-97b0fab41da4" containerName="oc" Mar 10 14:05:53 crc kubenswrapper[4911]: E0310 14:05:53.942686 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faadc4e1-e90d-4a84-8268-f62704120a1a" containerName="pruner" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942693 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="faadc4e1-e90d-4a84-8268-f62704120a1a" containerName="pruner" Mar 10 14:05:53 crc kubenswrapper[4911]: E0310 14:05:53.942702 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="02752ba7-8d45-4e41-a893-56a3b541dd63" containerName="controller-manager" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942709 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="02752ba7-8d45-4e41-a893-56a3b541dd63" containerName="controller-manager" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942838 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a076468-8db4-42a1-9780-77d667f8375a" containerName="pruner" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942847 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f3939e-4468-4458-b205-443507df1e5f" containerName="route-controller-manager" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942858 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="02752ba7-8d45-4e41-a893-56a3b541dd63" containerName="controller-manager" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942866 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="30365124-15de-458c-b8f8-97b0fab41da4" containerName="oc" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.942878 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="faadc4e1-e90d-4a84-8268-f62704120a1a" containerName="pruner" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.943276 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:53 crc kubenswrapper[4911]: I0310 14:05:53.954853 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd"] Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.013007 4911 patch_prober.go:28] interesting pod/route-controller-manager-fd5d64b4f-vq4x4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.013098 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" podUID="60f3939e-4468-4458-b205-443507df1e5f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.033258 4911 patch_prober.go:28] interesting pod/controller-manager-69859c5bbf-gm992 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: i/o timeout" start-of-body= Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.033314 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" podUID="02752ba7-8d45-4e41-a893-56a3b541dd63" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: i/o timeout" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.073992 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f3939e-4468-4458-b205-443507df1e5f-serving-cert\") pod \"60f3939e-4468-4458-b205-443507df1e5f\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074056 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-config\") pod \"02752ba7-8d45-4e41-a893-56a3b541dd63\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074127 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8wnf\" (UniqueName: \"kubernetes.io/projected/02752ba7-8d45-4e41-a893-56a3b541dd63-kube-api-access-w8wnf\") pod \"02752ba7-8d45-4e41-a893-56a3b541dd63\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074152 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02752ba7-8d45-4e41-a893-56a3b541dd63-serving-cert\") pod \"02752ba7-8d45-4e41-a893-56a3b541dd63\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074234 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-client-ca\") pod \"60f3939e-4468-4458-b205-443507df1e5f\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074272 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-proxy-ca-bundles\") pod \"02752ba7-8d45-4e41-a893-56a3b541dd63\" (UID: 
\"02752ba7-8d45-4e41-a893-56a3b541dd63\") " Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074323 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-config\") pod \"60f3939e-4468-4458-b205-443507df1e5f\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074349 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbhpz\" (UniqueName: \"kubernetes.io/projected/60f3939e-4468-4458-b205-443507df1e5f-kube-api-access-qbhpz\") pod \"60f3939e-4468-4458-b205-443507df1e5f\" (UID: \"60f3939e-4468-4458-b205-443507df1e5f\") " Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074393 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-client-ca\") pod \"02752ba7-8d45-4e41-a893-56a3b541dd63\" (UID: \"02752ba7-8d45-4e41-a893-56a3b541dd63\") " Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074584 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc5fj\" (UniqueName: \"kubernetes.io/projected/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-kube-api-access-jc5fj\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074711 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-config\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " 
pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074795 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-client-ca\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.074837 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-serving-cert\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.075554 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-client-ca" (OuterVolumeSpecName: "client-ca") pod "60f3939e-4468-4458-b205-443507df1e5f" (UID: "60f3939e-4468-4458-b205-443507df1e5f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.075645 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "02752ba7-8d45-4e41-a893-56a3b541dd63" (UID: "02752ba7-8d45-4e41-a893-56a3b541dd63"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.076263 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-client-ca" (OuterVolumeSpecName: "client-ca") pod "02752ba7-8d45-4e41-a893-56a3b541dd63" (UID: "02752ba7-8d45-4e41-a893-56a3b541dd63"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.076448 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-config" (OuterVolumeSpecName: "config") pod "02752ba7-8d45-4e41-a893-56a3b541dd63" (UID: "02752ba7-8d45-4e41-a893-56a3b541dd63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.076460 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-config" (OuterVolumeSpecName: "config") pod "60f3939e-4468-4458-b205-443507df1e5f" (UID: "60f3939e-4468-4458-b205-443507df1e5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.085454 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f3939e-4468-4458-b205-443507df1e5f-kube-api-access-qbhpz" (OuterVolumeSpecName: "kube-api-access-qbhpz") pod "60f3939e-4468-4458-b205-443507df1e5f" (UID: "60f3939e-4468-4458-b205-443507df1e5f"). InnerVolumeSpecName "kube-api-access-qbhpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.085475 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f3939e-4468-4458-b205-443507df1e5f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "60f3939e-4468-4458-b205-443507df1e5f" (UID: "60f3939e-4468-4458-b205-443507df1e5f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.085640 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02752ba7-8d45-4e41-a893-56a3b541dd63-kube-api-access-w8wnf" (OuterVolumeSpecName: "kube-api-access-w8wnf") pod "02752ba7-8d45-4e41-a893-56a3b541dd63" (UID: "02752ba7-8d45-4e41-a893-56a3b541dd63"). InnerVolumeSpecName "kube-api-access-w8wnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.085915 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02752ba7-8d45-4e41-a893-56a3b541dd63-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "02752ba7-8d45-4e41-a893-56a3b541dd63" (UID: "02752ba7-8d45-4e41-a893-56a3b541dd63"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175457 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-serving-cert\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175520 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc5fj\" (UniqueName: \"kubernetes.io/projected/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-kube-api-access-jc5fj\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175608 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-config\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175654 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-client-ca\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175697 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbhpz\" (UniqueName: 
\"kubernetes.io/projected/60f3939e-4468-4458-b205-443507df1e5f-kube-api-access-qbhpz\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175710 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175738 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f3939e-4468-4458-b205-443507df1e5f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175750 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175760 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8wnf\" (UniqueName: \"kubernetes.io/projected/02752ba7-8d45-4e41-a893-56a3b541dd63-kube-api-access-w8wnf\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175771 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02752ba7-8d45-4e41-a893-56a3b541dd63-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175783 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.175792 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02752ba7-8d45-4e41-a893-56a3b541dd63-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:54 crc 
kubenswrapper[4911]: I0310 14:05:54.175802 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f3939e-4468-4458-b205-443507df1e5f-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.176758 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-client-ca\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.179199 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-config\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.181125 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-serving-cert\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.195752 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc5fj\" (UniqueName: \"kubernetes.io/projected/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-kube-api-access-jc5fj\") pod \"route-controller-manager-85cb9c8f9-wmwkd\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") " pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.267682 
4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.540653 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" event={"ID":"02752ba7-8d45-4e41-a893-56a3b541dd63","Type":"ContainerDied","Data":"561ec485a592a972a8e45390881eacbd5474f3bd360837c0823e5d217ef18b3f"} Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.540717 4911 scope.go:117] "RemoveContainer" containerID="7eb9cd67326b9e19ba44d30f604a9d3a8245864254cbf08a8b27d579908ee514" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.540717 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69859c5bbf-gm992" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.542027 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" event={"ID":"60f3939e-4468-4458-b205-443507df1e5f","Type":"ContainerDied","Data":"471666a813605bf0ab55ccad952de8ffb2e282e1de74a0595dd560a823c3dcac"} Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.542052 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4" Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.564382 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"] Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.574114 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fd5d64b4f-vq4x4"] Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.577705 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69859c5bbf-gm992"] Mar 10 14:05:54 crc kubenswrapper[4911]: I0310 14:05:54.580206 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-69859c5bbf-gm992"] Mar 10 14:05:55 crc kubenswrapper[4911]: I0310 14:05:55.908925 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 14:05:55 crc kubenswrapper[4911]: I0310 14:05:55.909935 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 14:05:55 crc kubenswrapper[4911]: I0310 14:05:55.914453 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 14:05:55 crc kubenswrapper[4911]: I0310 14:05:55.915919 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 14:05:55 crc kubenswrapper[4911]: I0310 14:05:55.924762 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.104368 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3398d162-faa8-41bb-afc3-d1f8a645db17-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3398d162-faa8-41bb-afc3-d1f8a645db17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.104422 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3398d162-faa8-41bb-afc3-d1f8a645db17-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3398d162-faa8-41bb-afc3-d1f8a645db17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.144834 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.202934 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02752ba7-8d45-4e41-a893-56a3b541dd63" path="/var/lib/kubelet/pods/02752ba7-8d45-4e41-a893-56a3b541dd63/volumes" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.203463 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="60f3939e-4468-4458-b205-443507df1e5f" path="/var/lib/kubelet/pods/60f3939e-4468-4458-b205-443507df1e5f/volumes" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.206853 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3398d162-faa8-41bb-afc3-d1f8a645db17-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3398d162-faa8-41bb-afc3-d1f8a645db17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.206903 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3398d162-faa8-41bb-afc3-d1f8a645db17-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3398d162-faa8-41bb-afc3-d1f8a645db17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.207356 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3398d162-faa8-41bb-afc3-d1f8a645db17-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3398d162-faa8-41bb-afc3-d1f8a645db17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.231973 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3398d162-faa8-41bb-afc3-d1f8a645db17-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3398d162-faa8-41bb-afc3-d1f8a645db17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.248165 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.704553 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-688fb998b9-wjsrz"] Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.705321 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.707998 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.708158 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.708290 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.708398 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.708666 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.708826 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.726177 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.728022 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-688fb998b9-wjsrz"] Mar 10 14:05:56 crc 
kubenswrapper[4911]: I0310 14:05:56.818627 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-config\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.818766 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f27a7f-2ef1-43b1-990a-c759b55431d8-serving-cert\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.818839 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kklm8\" (UniqueName: \"kubernetes.io/projected/44f27a7f-2ef1-43b1-990a-c759b55431d8-kube-api-access-kklm8\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.818876 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-client-ca\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.818897 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-proxy-ca-bundles\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.920232 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kklm8\" (UniqueName: \"kubernetes.io/projected/44f27a7f-2ef1-43b1-990a-c759b55431d8-kube-api-access-kklm8\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.920315 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-client-ca\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.920353 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-proxy-ca-bundles\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.920381 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-config\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.920445 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f27a7f-2ef1-43b1-990a-c759b55431d8-serving-cert\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.921594 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-client-ca\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.922285 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-proxy-ca-bundles\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.922552 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-config\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc kubenswrapper[4911]: I0310 14:05:56.927850 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f27a7f-2ef1-43b1-990a-c759b55431d8-serving-cert\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:56 crc 
kubenswrapper[4911]: I0310 14:05:56.937400 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kklm8\" (UniqueName: \"kubernetes.io/projected/44f27a7f-2ef1-43b1-990a-c759b55431d8-kube-api-access-kklm8\") pod \"controller-manager-688fb998b9-wjsrz\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") " pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:05:57 crc kubenswrapper[4911]: I0310 14:05:57.028304 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.149512 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552526-hfmpq"] Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.153068 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552526-hfmpq" Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.157306 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.157629 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.157818 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.161271 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552526-hfmpq"] Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.273582 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrql\" (UniqueName: \"kubernetes.io/projected/d90ef4fc-8e97-468a-b0ba-d7105067b50c-kube-api-access-jvrql\") pod 
\"auto-csr-approver-29552526-hfmpq\" (UID: \"d90ef4fc-8e97-468a-b0ba-d7105067b50c\") " pod="openshift-infra/auto-csr-approver-29552526-hfmpq" Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.338393 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-688fb998b9-wjsrz"] Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.375661 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrql\" (UniqueName: \"kubernetes.io/projected/d90ef4fc-8e97-468a-b0ba-d7105067b50c-kube-api-access-jvrql\") pod \"auto-csr-approver-29552526-hfmpq\" (UID: \"d90ef4fc-8e97-468a-b0ba-d7105067b50c\") " pod="openshift-infra/auto-csr-approver-29552526-hfmpq" Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.412877 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrql\" (UniqueName: \"kubernetes.io/projected/d90ef4fc-8e97-468a-b0ba-d7105067b50c-kube-api-access-jvrql\") pod \"auto-csr-approver-29552526-hfmpq\" (UID: \"d90ef4fc-8e97-468a-b0ba-d7105067b50c\") " pod="openshift-infra/auto-csr-approver-29552526-hfmpq" Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.429954 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd"] Mar 10 14:06:00 crc kubenswrapper[4911]: I0310 14:06:00.481888 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552526-hfmpq" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.505888 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.506831 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.525310 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.609193 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.609370 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-var-lock\") pod \"installer-9-crc\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.609530 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7365bf68-b4d4-4df4-8e73-6c2336e58792-kube-api-access\") pod \"installer-9-crc\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.711216 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-var-lock\") pod \"installer-9-crc\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.711364 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7365bf68-b4d4-4df4-8e73-6c2336e58792-kube-api-access\") pod \"installer-9-crc\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.711395 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.711516 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.711814 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-var-lock\") pod \"installer-9-crc\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.734934 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7365bf68-b4d4-4df4-8e73-6c2336e58792-kube-api-access\") pod \"installer-9-crc\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:01 crc kubenswrapper[4911]: I0310 14:06:01.841047 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:02 crc kubenswrapper[4911]: E0310 14:06:02.629963 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 14:06:02 crc kubenswrapper[4911]: E0310 14:06:02.630186 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pvjl4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-48nht_openshift-marketplace(744c8ea6-3be6-496d-a6fc-002d3f0f95e4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 14:06:02 crc kubenswrapper[4911]: E0310 14:06:02.631417 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-48nht" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" Mar 10 14:06:03 crc kubenswrapper[4911]: I0310 14:06:03.706313 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:06:03 crc kubenswrapper[4911]: I0310 14:06:03.706386 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:06:04 crc kubenswrapper[4911]: E0310 14:06:04.990011 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-48nht" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" Mar 10 14:06:05 crc kubenswrapper[4911]: E0310 14:06:05.061680 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 14:06:05 crc kubenswrapper[4911]: E0310 14:06:05.061976 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hpcqd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ttx9c_openshift-marketplace(9549430d-b06c-4c28-87bc-6320e73c31e5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Mar 10 14:06:05 crc kubenswrapper[4911]: E0310 14:06:05.063276 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ttx9c" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" Mar 10 14:06:05 crc kubenswrapper[4911]: E0310 14:06:05.067991 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 14:06:05 crc kubenswrapper[4911]: E0310 14:06:05.068116 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8jfr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-cvtlp_openshift-marketplace(b986951d-80c8-4f06-a12b-9dd8047a7bf5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 14:06:05 crc kubenswrapper[4911]: E0310 14:06:05.069337 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cvtlp" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" Mar 10 14:06:06 crc 
kubenswrapper[4911]: E0310 14:06:06.475627 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cvtlp" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" Mar 10 14:06:06 crc kubenswrapper[4911]: I0310 14:06:06.538137 4911 scope.go:117] "RemoveContainer" containerID="3eec423ab2925d1af16f6de208f681e60f83213abdc5e5571cb113bf8317957b" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.550213 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.550395 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-884ch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-9hlsw_openshift-marketplace(713a2021-c450-41c7-b93d-ccee816a9820): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.552442 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9hlsw" podUID="713a2021-c450-41c7-b93d-ccee816a9820" Mar 10 14:06:06 crc 
kubenswrapper[4911]: E0310 14:06:06.584992 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.585181 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wxf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-w2clr_openshift-marketplace(e1796b5b-f2e1-4a7a-9463-039bb296626a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.586266 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-w2clr" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.587080 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.587284 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wzwk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vrd7f_openshift-marketplace(124dcf69-acd6-4d61-ab64-3cf0840df098): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.588852 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vrd7f" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" Mar 10 14:06:06 crc 
kubenswrapper[4911]: E0310 14:06:06.634212 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vrd7f" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.634513 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9hlsw" podUID="713a2021-c450-41c7-b93d-ccee816a9820" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.634617 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-w2clr" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.665375 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.665532 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8ml8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-9pm6m_openshift-marketplace(f11c754a-10a0-46ac-b171-5ccfecebdb7c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 10 14:06:06 crc kubenswrapper[4911]: E0310 14:06:06.666602 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9pm6m" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c"
Mar 10 14:06:06 crc kubenswrapper[4911]: I0310 14:06:06.846457 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd"]
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.049546 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 10 14:06:07 crc kubenswrapper[4911]: W0310 14:06:07.061808 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7365bf68_b4d4_4df4_8e73_6c2336e58792.slice/crio-ed4f5bd1e7d618ef68e89e456fe84b9466f7f2b0af140fa16b9e3e3251763702 WatchSource:0}: Error finding container ed4f5bd1e7d618ef68e89e456fe84b9466f7f2b0af140fa16b9e3e3251763702: Status 404 returned error can't find the container with id ed4f5bd1e7d618ef68e89e456fe84b9466f7f2b0af140fa16b9e3e3251763702
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.191280 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-688fb998b9-wjsrz"]
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.194582 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552526-hfmpq"]
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.199668 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 10 14:06:07 crc kubenswrapper[4911]: W0310 14:06:07.209220 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd90ef4fc_8e97_468a_b0ba_d7105067b50c.slice/crio-d45380be9f2deb138a83754cc5c267ad1042f92666f6ebefb2bca4ee167bdeeb WatchSource:0}: Error finding container d45380be9f2deb138a83754cc5c267ad1042f92666f6ebefb2bca4ee167bdeeb: Status 404 returned error can't find the container with id d45380be9f2deb138a83754cc5c267ad1042f92666f6ebefb2bca4ee167bdeeb
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.642744 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3398d162-faa8-41bb-afc3-d1f8a645db17","Type":"ContainerStarted","Data":"8a13544b1c35ddd65dd9b66528822c08891cd24f23b98928dcdd0c17967a140a"}
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.646429 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" event={"ID":"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4","Type":"ContainerStarted","Data":"c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6"}
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.646465 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" event={"ID":"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4","Type":"ContainerStarted","Data":"6245298c3bf0bb25180ee5a72173832730ec1a56d410d3cd1fa17f38f5228a79"}
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.646978 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" podUID="c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" containerName="route-controller-manager" containerID="cri-o://c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6" gracePeriod=30
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.647155 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd"
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.652088 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7365bf68-b4d4-4df4-8e73-6c2336e58792","Type":"ContainerStarted","Data":"ed4f5bd1e7d618ef68e89e456fe84b9466f7f2b0af140fa16b9e3e3251763702"}
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.654213 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552526-hfmpq" event={"ID":"d90ef4fc-8e97-468a-b0ba-d7105067b50c","Type":"ContainerStarted","Data":"d45380be9f2deb138a83754cc5c267ad1042f92666f6ebefb2bca4ee167bdeeb"}
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.656586 4911 generic.go:334] "Generic (PLEG): container finished" podID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerID="501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd" exitCode=0
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.656662 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knb5d" event={"ID":"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9","Type":"ContainerDied","Data":"501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd"}
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.665163 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" podStartSLOduration=27.665144569 podStartE2EDuration="27.665144569s" podCreationTimestamp="2026-03-10 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:06:07.662856494 +0000 UTC m=+272.226376411" watchObservedRunningTime="2026-03-10 14:06:07.665144569 +0000 UTC m=+272.228664486"
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.668933 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" event={"ID":"44f27a7f-2ef1-43b1-990a-c759b55431d8","Type":"ContainerStarted","Data":"17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809"}
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.668969 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" event={"ID":"44f27a7f-2ef1-43b1-990a-c759b55431d8","Type":"ContainerStarted","Data":"7d47faa62f50e3a56a37baf184423a61001b1514190ddf53b7e9c35f2045a516"}
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.671968 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nfhwf" event={"ID":"1ad00bf3-d146-4d5d-806e-fb340e3762bf","Type":"ContainerStarted","Data":"bc3eb4efceddbc1f51a49e639f8dc853e448303efb9caa6d25b5c0d6a1180082"}
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.672748 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nfhwf"
Mar 10 14:06:07 crc kubenswrapper[4911]: E0310 14:06:07.673597 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9pm6m" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c"
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.673867 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body=
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.673897 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused"
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.711901 4911 patch_prober.go:28] interesting pod/route-controller-manager-85cb9c8f9-wmwkd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:35426->10.217.0.57:8443: read: connection reset by peer" start-of-body=
Mar 10 14:06:07 crc kubenswrapper[4911]: I0310 14:06:07.711981 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" podUID="c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:35426->10.217.0.57:8443: read: connection reset by peer"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.654266 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-85cb9c8f9-wmwkd_c4b1c8f2-8cc8-42e2-95f2-523c53b742f4/route-controller-manager/0.log"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.654809 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.684333 4911 generic.go:334] "Generic (PLEG): container finished" podID="3398d162-faa8-41bb-afc3-d1f8a645db17" containerID="c1b137afb2b5f9fc7e097f6c1cb846de972211ab38cc5234c2b27a734536584b" exitCode=0
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.684462 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3398d162-faa8-41bb-afc3-d1f8a645db17","Type":"ContainerDied","Data":"c1b137afb2b5f9fc7e097f6c1cb846de972211ab38cc5234c2b27a734536584b"}
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.688251 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-85cb9c8f9-wmwkd_c4b1c8f2-8cc8-42e2-95f2-523c53b742f4/route-controller-manager/0.log"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.688296 4911 generic.go:334] "Generic (PLEG): container finished" podID="c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" containerID="c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6" exitCode=255
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.688393 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" event={"ID":"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4","Type":"ContainerDied","Data":"c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6"}
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.688424 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd" event={"ID":"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4","Type":"ContainerDied","Data":"6245298c3bf0bb25180ee5a72173832730ec1a56d410d3cd1fa17f38f5228a79"}
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.688453 4911 scope.go:117] "RemoveContainer" containerID="c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.688827 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.691456 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"]
Mar 10 14:06:08 crc kubenswrapper[4911]: E0310 14:06:08.691867 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" containerName="route-controller-manager"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.691898 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" containerName="route-controller-manager"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.692048 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" containerName="route-controller-manager"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.692494 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.694268 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7365bf68-b4d4-4df4-8e73-6c2336e58792","Type":"ContainerStarted","Data":"efbeefe26f2d4b1d8d43736ff1c2fe5470a3df1ce5c40ec70540d9ab83dd5cfe"}
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.694896 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" podUID="44f27a7f-2ef1-43b1-990a-c759b55431d8" containerName="controller-manager" containerID="cri-o://17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809" gracePeriod=30
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.695276 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body=
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.695351 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.714305 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"]
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.739421 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-client-ca\") pod \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") "
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.739543 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-serving-cert\") pod \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") "
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.739611 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-config\") pod \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") "
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.739654 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc5fj\" (UniqueName: \"kubernetes.io/projected/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-kube-api-access-jc5fj\") pod \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\" (UID: \"c4b1c8f2-8cc8-42e2-95f2-523c53b742f4\") "
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.739891 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m8qr\" (UniqueName: \"kubernetes.io/projected/63fc9dce-4616-49b0-bfec-62499b691d1a-kube-api-access-9m8qr\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.739935 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc9dce-4616-49b0-bfec-62499b691d1a-serving-cert\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.739968 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-config\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.739996 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-client-ca\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.740925 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-client-ca" (OuterVolumeSpecName: "client-ca") pod "c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" (UID: "c4b1c8f2-8cc8-42e2-95f2-523c53b742f4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.741744 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-config" (OuterVolumeSpecName: "config") pod "c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" (UID: "c4b1c8f2-8cc8-42e2-95f2-523c53b742f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.758419 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" (UID: "c4b1c8f2-8cc8-42e2-95f2-523c53b742f4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.760587 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-kube-api-access-jc5fj" (OuterVolumeSpecName: "kube-api-access-jc5fj") pod "c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" (UID: "c4b1c8f2-8cc8-42e2-95f2-523c53b742f4"). InnerVolumeSpecName "kube-api-access-jc5fj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.764561 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" podStartSLOduration=28.764534901 podStartE2EDuration="28.764534901s" podCreationTimestamp="2026-03-10 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:06:08.761335954 +0000 UTC m=+273.324855881" watchObservedRunningTime="2026-03-10 14:06:08.764534901 +0000 UTC m=+273.328054818"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.791492 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.79145104 podStartE2EDuration="7.79145104s" podCreationTimestamp="2026-03-10 14:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:06:08.789253957 +0000 UTC m=+273.352773874" watchObservedRunningTime="2026-03-10 14:06:08.79145104 +0000 UTC m=+273.354970957"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.840996 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m8qr\" (UniqueName: \"kubernetes.io/projected/63fc9dce-4616-49b0-bfec-62499b691d1a-kube-api-access-9m8qr\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.841061 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc9dce-4616-49b0-bfec-62499b691d1a-serving-cert\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.841094 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-config\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.841124 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-client-ca\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.841178 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.841192 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-config\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.841202 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc5fj\" (UniqueName: \"kubernetes.io/projected/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-kube-api-access-jc5fj\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.841211 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.842622 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-client-ca\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.844515 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-config\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.848596 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc9dce-4616-49b0-bfec-62499b691d1a-serving-cert\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.869275 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m8qr\" (UniqueName: \"kubernetes.io/projected/63fc9dce-4616-49b0-bfec-62499b691d1a-kube-api-access-9m8qr\") pod \"route-controller-manager-86786b9875-7z6nw\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.976079 4911 scope.go:117] "RemoveContainer" containerID="c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6"
Mar 10 14:06:08 crc kubenswrapper[4911]: E0310 14:06:08.976613 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6\": container with ID starting with c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6 not found: ID does not exist" containerID="c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6"
Mar 10 14:06:08 crc kubenswrapper[4911]: I0310 14:06:08.976649 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6"} err="failed to get container status \"c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6\": rpc error: code = NotFound desc = could not find container \"c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6\": container with ID starting with c26f05a4820c05f918478541edfd2aa1b62e910f19f8c4924418f98d34b926b6 not found: ID does not exist"
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.015284 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.019478 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd"]
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.022269 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cb9c8f9-wmwkd"]
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.343665 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz"
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.452532 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-client-ca\") pod \"44f27a7f-2ef1-43b1-990a-c759b55431d8\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") "
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.452597 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-config\") pod \"44f27a7f-2ef1-43b1-990a-c759b55431d8\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") "
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.452630 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kklm8\" (UniqueName: \"kubernetes.io/projected/44f27a7f-2ef1-43b1-990a-c759b55431d8-kube-api-access-kklm8\") pod \"44f27a7f-2ef1-43b1-990a-c759b55431d8\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") "
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.452755 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-proxy-ca-bundles\") pod \"44f27a7f-2ef1-43b1-990a-c759b55431d8\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") "
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.452806 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f27a7f-2ef1-43b1-990a-c759b55431d8-serving-cert\") pod \"44f27a7f-2ef1-43b1-990a-c759b55431d8\" (UID: \"44f27a7f-2ef1-43b1-990a-c759b55431d8\") "
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.454014 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-config" (OuterVolumeSpecName: "config") pod "44f27a7f-2ef1-43b1-990a-c759b55431d8" (UID: "44f27a7f-2ef1-43b1-990a-c759b55431d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.454904 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "44f27a7f-2ef1-43b1-990a-c759b55431d8" (UID: "44f27a7f-2ef1-43b1-990a-c759b55431d8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.455113 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-client-ca" (OuterVolumeSpecName: "client-ca") pod "44f27a7f-2ef1-43b1-990a-c759b55431d8" (UID: "44f27a7f-2ef1-43b1-990a-c759b55431d8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.457420 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f27a7f-2ef1-43b1-990a-c759b55431d8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "44f27a7f-2ef1-43b1-990a-c759b55431d8" (UID: "44f27a7f-2ef1-43b1-990a-c759b55431d8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.460069 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f27a7f-2ef1-43b1-990a-c759b55431d8-kube-api-access-kklm8" (OuterVolumeSpecName: "kube-api-access-kklm8") pod "44f27a7f-2ef1-43b1-990a-c759b55431d8" (UID: "44f27a7f-2ef1-43b1-990a-c759b55431d8"). InnerVolumeSpecName "kube-api-access-kklm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.554440 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.554472 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f27a7f-2ef1-43b1-990a-c759b55431d8-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.554482 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-config\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.554491 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f27a7f-2ef1-43b1-990a-c759b55431d8-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.554500 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kklm8\" (UniqueName: \"kubernetes.io/projected/44f27a7f-2ef1-43b1-990a-c759b55431d8-kube-api-access-kklm8\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.618560 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"]
Mar 10 14:06:09 crc kubenswrapper[4911]: W0310 14:06:09.639947 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63fc9dce_4616_49b0_bfec_62499b691d1a.slice/crio-ebe92ee0520796a0f9d4a74b4fa20e394c2a501f656cf99c233b23f8d5b36335 WatchSource:0}: Error finding container ebe92ee0520796a0f9d4a74b4fa20e394c2a501f656cf99c233b23f8d5b36335: Status 404 returned error can't find the container with id ebe92ee0520796a0f9d4a74b4fa20e394c2a501f656cf99c233b23f8d5b36335
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.707639 4911 generic.go:334] "Generic (PLEG): container finished" podID="44f27a7f-2ef1-43b1-990a-c759b55431d8" containerID="17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809" exitCode=0
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.707787 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" event={"ID":"44f27a7f-2ef1-43b1-990a-c759b55431d8","Type":"ContainerDied","Data":"17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809"}
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.707823 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz" event={"ID":"44f27a7f-2ef1-43b1-990a-c759b55431d8","Type":"ContainerDied","Data":"7d47faa62f50e3a56a37baf184423a61001b1514190ddf53b7e9c35f2045a516"}
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.707845 4911 scope.go:117] "RemoveContainer" containerID="17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809"
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.707975 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-688fb998b9-wjsrz"
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.711139 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw" event={"ID":"63fc9dce-4616-49b0-bfec-62499b691d1a","Type":"ContainerStarted","Data":"ebe92ee0520796a0f9d4a74b4fa20e394c2a501f656cf99c233b23f8d5b36335"}
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.712333 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body=
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.712378 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused"
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.754910 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-688fb998b9-wjsrz"]
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.757309 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-688fb998b9-wjsrz"]
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.854837 4911 scope.go:117] "RemoveContainer" containerID="17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809"
Mar 10 14:06:09 crc kubenswrapper[4911]: E0310 14:06:09.855416 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809\": container with ID starting with 17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809 not found: ID does not exist" containerID="17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809"
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.855454 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809"} err="failed to get container status \"17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809\": rpc error: code = NotFound desc = could not find container \"17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809\": container with ID starting with 17d73ced03c441915611a6541e68797ec42152948e529239fbd68aa836271809 not found: ID does not exist"
Mar 10 14:06:09 crc kubenswrapper[4911]: I0310 14:06:09.946265 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.065147 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3398d162-faa8-41bb-afc3-d1f8a645db17-kubelet-dir\") pod \"3398d162-faa8-41bb-afc3-d1f8a645db17\" (UID: \"3398d162-faa8-41bb-afc3-d1f8a645db17\") "
Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.065206 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3398d162-faa8-41bb-afc3-d1f8a645db17-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3398d162-faa8-41bb-afc3-d1f8a645db17" (UID: "3398d162-faa8-41bb-afc3-d1f8a645db17"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.065405 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3398d162-faa8-41bb-afc3-d1f8a645db17-kube-api-access\") pod \"3398d162-faa8-41bb-afc3-d1f8a645db17\" (UID: \"3398d162-faa8-41bb-afc3-d1f8a645db17\") "
Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.065947 4911 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3398d162-faa8-41bb-afc3-d1f8a645db17-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.070322 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3398d162-faa8-41bb-afc3-d1f8a645db17-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3398d162-faa8-41bb-afc3-d1f8a645db17" (UID: "3398d162-faa8-41bb-afc3-d1f8a645db17"). InnerVolumeSpecName "kube-api-access".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.167186 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3398d162-faa8-41bb-afc3-d1f8a645db17-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.204418 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f27a7f-2ef1-43b1-990a-c759b55431d8" path="/var/lib/kubelet/pods/44f27a7f-2ef1-43b1-990a-c759b55431d8/volumes" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.205134 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4b1c8f2-8cc8-42e2-95f2-523c53b742f4" path="/var/lib/kubelet/pods/c4b1c8f2-8cc8-42e2-95f2-523c53b742f4/volumes" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.719008 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-84f75b867c-q6wkm"] Mar 10 14:06:10 crc kubenswrapper[4911]: E0310 14:06:10.719350 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f27a7f-2ef1-43b1-990a-c759b55431d8" containerName="controller-manager" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.719370 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f27a7f-2ef1-43b1-990a-c759b55431d8" containerName="controller-manager" Mar 10 14:06:10 crc kubenswrapper[4911]: E0310 14:06:10.719383 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3398d162-faa8-41bb-afc3-d1f8a645db17" containerName="pruner" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.719393 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3398d162-faa8-41bb-afc3-d1f8a645db17" containerName="pruner" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.719560 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3398d162-faa8-41bb-afc3-d1f8a645db17" containerName="pruner" Mar 10 
14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.719578 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f27a7f-2ef1-43b1-990a-c759b55431d8" containerName="controller-manager" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.720066 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.720089 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw" event={"ID":"63fc9dce-4616-49b0-bfec-62499b691d1a","Type":"ContainerStarted","Data":"29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36"} Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.720215 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.721508 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3398d162-faa8-41bb-afc3-d1f8a645db17","Type":"ContainerDied","Data":"8a13544b1c35ddd65dd9b66528822c08891cd24f23b98928dcdd0c17967a140a"} Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.721533 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.721536 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a13544b1c35ddd65dd9b66528822c08891cd24f23b98928dcdd0c17967a140a" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.722768 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.723023 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.723029 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.726465 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knb5d" event={"ID":"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9","Type":"ContainerStarted","Data":"c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff"} Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.727374 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.727406 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.727607 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.730822 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 
14:06:10.732875 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.735651 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84f75b867c-q6wkm"] Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.750003 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw" podStartSLOduration=10.749974595 podStartE2EDuration="10.749974595s" podCreationTimestamp="2026-03-10 14:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:06:10.746156973 +0000 UTC m=+275.309676880" watchObservedRunningTime="2026-03-10 14:06:10.749974595 +0000 UTC m=+275.313494512" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.777282 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-client-ca\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.777387 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-proxy-ca-bundles\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.778773 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2d026d19-baf5-42f4-9cd4-10304d8f611e-serving-cert\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.778850 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-config\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.778889 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk4xl\" (UniqueName: \"kubernetes.io/projected/2d026d19-baf5-42f4-9cd4-10304d8f611e-kube-api-access-xk4xl\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.802369 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-knb5d" podStartSLOduration=7.213912569 podStartE2EDuration="46.802344157s" podCreationTimestamp="2026-03-10 14:05:24 +0000 UTC" firstStartedPulling="2026-03-10 14:05:29.842463591 +0000 UTC m=+234.405983508" lastFinishedPulling="2026-03-10 14:06:09.430895169 +0000 UTC m=+273.994415096" observedRunningTime="2026-03-10 14:06:10.801462596 +0000 UTC m=+275.364982523" watchObservedRunningTime="2026-03-10 14:06:10.802344157 +0000 UTC m=+275.365864074" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.880938 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2d026d19-baf5-42f4-9cd4-10304d8f611e-serving-cert\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.881037 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-config\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.881074 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk4xl\" (UniqueName: \"kubernetes.io/projected/2d026d19-baf5-42f4-9cd4-10304d8f611e-kube-api-access-xk4xl\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.881131 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-client-ca\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.881152 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-proxy-ca-bundles\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.883089 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-proxy-ca-bundles\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.883518 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-config\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.883799 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-client-ca\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.886193 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d026d19-baf5-42f4-9cd4-10304d8f611e-serving-cert\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:10 crc kubenswrapper[4911]: I0310 14:06:10.901294 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk4xl\" (UniqueName: \"kubernetes.io/projected/2d026d19-baf5-42f4-9cd4-10304d8f611e-kube-api-access-xk4xl\") pod \"controller-manager-84f75b867c-q6wkm\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 
14:06:11 crc kubenswrapper[4911]: I0310 14:06:11.044245 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:11 crc kubenswrapper[4911]: I0310 14:06:11.317292 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84f75b867c-q6wkm"] Mar 10 14:06:11 crc kubenswrapper[4911]: W0310 14:06:11.359578 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d026d19_baf5_42f4_9cd4_10304d8f611e.slice/crio-deb8ff2804341a5b9e4ec5223e967003ff4d4f5464c909161ab20932f6bb3b42 WatchSource:0}: Error finding container deb8ff2804341a5b9e4ec5223e967003ff4d4f5464c909161ab20932f6bb3b42: Status 404 returned error can't find the container with id deb8ff2804341a5b9e4ec5223e967003ff4d4f5464c909161ab20932f6bb3b42 Mar 10 14:06:11 crc kubenswrapper[4911]: I0310 14:06:11.773299 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" event={"ID":"2d026d19-baf5-42f4-9cd4-10304d8f611e","Type":"ContainerStarted","Data":"8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15"} Mar 10 14:06:11 crc kubenswrapper[4911]: I0310 14:06:11.773704 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:11 crc kubenswrapper[4911]: I0310 14:06:11.773752 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" event={"ID":"2d026d19-baf5-42f4-9cd4-10304d8f611e","Type":"ContainerStarted","Data":"deb8ff2804341a5b9e4ec5223e967003ff4d4f5464c909161ab20932f6bb3b42"} Mar 10 14:06:11 crc kubenswrapper[4911]: I0310 14:06:11.775949 4911 generic.go:334] "Generic (PLEG): container finished" podID="d90ef4fc-8e97-468a-b0ba-d7105067b50c" 
containerID="bf3efacd370b2cff9d688064dd0261963b117775fbf8ca51b9b199b509f410e4" exitCode=0 Mar 10 14:06:11 crc kubenswrapper[4911]: I0310 14:06:11.776125 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552526-hfmpq" event={"ID":"d90ef4fc-8e97-468a-b0ba-d7105067b50c","Type":"ContainerDied","Data":"bf3efacd370b2cff9d688064dd0261963b117775fbf8ca51b9b199b509f410e4"} Mar 10 14:06:11 crc kubenswrapper[4911]: I0310 14:06:11.777684 4911 patch_prober.go:28] interesting pod/controller-manager-84f75b867c-q6wkm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Mar 10 14:06:11 crc kubenswrapper[4911]: I0310 14:06:11.777962 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" podUID="2d026d19-baf5-42f4-9cd4-10304d8f611e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Mar 10 14:06:11 crc kubenswrapper[4911]: I0310 14:06:11.800878 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" podStartSLOduration=11.800845137 podStartE2EDuration="11.800845137s" podCreationTimestamp="2026-03-10 14:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:06:11.796995344 +0000 UTC m=+276.360515271" watchObservedRunningTime="2026-03-10 14:06:11.800845137 +0000 UTC m=+276.364365054" Mar 10 14:06:12 crc kubenswrapper[4911]: I0310 14:06:12.790162 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:13 crc 
kubenswrapper[4911]: I0310 14:06:13.100533 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552526-hfmpq" Mar 10 14:06:13 crc kubenswrapper[4911]: I0310 14:06:13.227927 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvrql\" (UniqueName: \"kubernetes.io/projected/d90ef4fc-8e97-468a-b0ba-d7105067b50c-kube-api-access-jvrql\") pod \"d90ef4fc-8e97-468a-b0ba-d7105067b50c\" (UID: \"d90ef4fc-8e97-468a-b0ba-d7105067b50c\") " Mar 10 14:06:13 crc kubenswrapper[4911]: I0310 14:06:13.234588 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90ef4fc-8e97-468a-b0ba-d7105067b50c-kube-api-access-jvrql" (OuterVolumeSpecName: "kube-api-access-jvrql") pod "d90ef4fc-8e97-468a-b0ba-d7105067b50c" (UID: "d90ef4fc-8e97-468a-b0ba-d7105067b50c"). InnerVolumeSpecName "kube-api-access-jvrql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:06:13 crc kubenswrapper[4911]: I0310 14:06:13.329763 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvrql\" (UniqueName: \"kubernetes.io/projected/d90ef4fc-8e97-468a-b0ba-d7105067b50c-kube-api-access-jvrql\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:14 crc kubenswrapper[4911]: I0310 14:06:13.706917 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:06:14 crc kubenswrapper[4911]: I0310 14:06:13.707029 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 
14:06:14 crc kubenswrapper[4911]: I0310 14:06:13.707002 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfhwf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 10 14:06:14 crc kubenswrapper[4911]: I0310 14:06:13.707169 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nfhwf" podUID="1ad00bf3-d146-4d5d-806e-fb340e3762bf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.39:8080/\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 10 14:06:14 crc kubenswrapper[4911]: I0310 14:06:13.788277 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552526-hfmpq" event={"ID":"d90ef4fc-8e97-468a-b0ba-d7105067b50c","Type":"ContainerDied","Data":"d45380be9f2deb138a83754cc5c267ad1042f92666f6ebefb2bca4ee167bdeeb"} Mar 10 14:06:14 crc kubenswrapper[4911]: I0310 14:06:13.788343 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45380be9f2deb138a83754cc5c267ad1042f92666f6ebefb2bca4ee167bdeeb" Mar 10 14:06:14 crc kubenswrapper[4911]: I0310 14:06:13.788291 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552526-hfmpq" Mar 10 14:06:14 crc kubenswrapper[4911]: I0310 14:06:14.449734 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-knb5d" Mar 10 14:06:14 crc kubenswrapper[4911]: I0310 14:06:14.450109 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-knb5d" Mar 10 14:06:16 crc kubenswrapper[4911]: I0310 14:06:16.075444 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-knb5d" podUID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerName="registry-server" probeResult="failure" output=< Mar 10 14:06:16 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s Mar 10 14:06:16 crc kubenswrapper[4911]: > Mar 10 14:06:17 crc kubenswrapper[4911]: I0310 14:06:17.815674 4911 generic.go:334] "Generic (PLEG): container finished" podID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerID="e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157" exitCode=0 Mar 10 14:06:17 crc kubenswrapper[4911]: I0310 14:06:17.815810 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttx9c" event={"ID":"9549430d-b06c-4c28-87bc-6320e73c31e5","Type":"ContainerDied","Data":"e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157"} Mar 10 14:06:18 crc kubenswrapper[4911]: I0310 14:06:18.521808 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:06:18 crc kubenswrapper[4911]: I0310 14:06:18.522601 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" 
podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:06:18 crc kubenswrapper[4911]: I0310 14:06:18.522661 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:06:18 crc kubenswrapper[4911]: I0310 14:06:18.524135 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 14:06:18 crc kubenswrapper[4911]: I0310 14:06:18.524228 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950" gracePeriod=600 Mar 10 14:06:18 crc kubenswrapper[4911]: I0310 14:06:18.826642 4911 generic.go:334] "Generic (PLEG): container finished" podID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerID="2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0" exitCode=0 Mar 10 14:06:18 crc kubenswrapper[4911]: I0310 14:06:18.826710 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48nht" event={"ID":"744c8ea6-3be6-496d-a6fc-002d3f0f95e4","Type":"ContainerDied","Data":"2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0"} Mar 10 14:06:18 crc kubenswrapper[4911]: I0310 14:06:18.832899 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" 
containerID="7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950" exitCode=0 Mar 10 14:06:18 crc kubenswrapper[4911]: I0310 14:06:18.832943 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950"} Mar 10 14:06:19 crc kubenswrapper[4911]: I0310 14:06:19.841680 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttx9c" event={"ID":"9549430d-b06c-4c28-87bc-6320e73c31e5","Type":"ContainerStarted","Data":"e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19"} Mar 10 14:06:19 crc kubenswrapper[4911]: I0310 14:06:19.845061 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48nht" event={"ID":"744c8ea6-3be6-496d-a6fc-002d3f0f95e4","Type":"ContainerStarted","Data":"d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd"} Mar 10 14:06:19 crc kubenswrapper[4911]: I0310 14:06:19.849217 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"0bb21d0d9028cc517f2b21aadcd53128eee7aee107df38fbfad5959639c7f688"} Mar 10 14:06:19 crc kubenswrapper[4911]: I0310 14:06:19.883512 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ttx9c" podStartSLOduration=4.909711869 podStartE2EDuration="59.883485683s" podCreationTimestamp="2026-03-10 14:05:20 +0000 UTC" firstStartedPulling="2026-03-10 14:05:23.783674935 +0000 UTC m=+228.347194852" lastFinishedPulling="2026-03-10 14:06:18.757448749 +0000 UTC m=+283.320968666" observedRunningTime="2026-03-10 14:06:19.882774176 +0000 UTC m=+284.446294093" watchObservedRunningTime="2026-03-10 
14:06:19.883485683 +0000 UTC m=+284.447005600" Mar 10 14:06:19 crc kubenswrapper[4911]: I0310 14:06:19.928520 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-48nht" podStartSLOduration=7.485957908 podStartE2EDuration="56.928500038s" podCreationTimestamp="2026-03-10 14:05:23 +0000 UTC" firstStartedPulling="2026-03-10 14:05:29.836802155 +0000 UTC m=+234.400322112" lastFinishedPulling="2026-03-10 14:06:19.279344325 +0000 UTC m=+283.842864242" observedRunningTime="2026-03-10 14:06:19.924708986 +0000 UTC m=+284.488228903" watchObservedRunningTime="2026-03-10 14:06:19.928500038 +0000 UTC m=+284.492019955" Mar 10 14:06:21 crc kubenswrapper[4911]: I0310 14:06:21.338265 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ttx9c" Mar 10 14:06:21 crc kubenswrapper[4911]: I0310 14:06:21.338863 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ttx9c" Mar 10 14:06:21 crc kubenswrapper[4911]: I0310 14:06:21.485536 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ttx9c" Mar 10 14:06:21 crc kubenswrapper[4911]: I0310 14:06:21.862266 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hlsw" event={"ID":"713a2021-c450-41c7-b93d-ccee816a9820","Type":"ContainerStarted","Data":"15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714"} Mar 10 14:06:22 crc kubenswrapper[4911]: I0310 14:06:22.872336 4911 generic.go:334] "Generic (PLEG): container finished" podID="713a2021-c450-41c7-b93d-ccee816a9820" containerID="15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714" exitCode=0 Mar 10 14:06:22 crc kubenswrapper[4911]: I0310 14:06:22.872392 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hlsw" 
event={"ID":"713a2021-c450-41c7-b93d-ccee816a9820","Type":"ContainerDied","Data":"15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714"} Mar 10 14:06:23 crc kubenswrapper[4911]: I0310 14:06:23.726472 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nfhwf" Mar 10 14:06:23 crc kubenswrapper[4911]: I0310 14:06:23.954199 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-48nht" Mar 10 14:06:23 crc kubenswrapper[4911]: I0310 14:06:23.954270 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-48nht" Mar 10 14:06:23 crc kubenswrapper[4911]: I0310 14:06:23.994699 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-48nht" Mar 10 14:06:24 crc kubenswrapper[4911]: I0310 14:06:24.512300 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-knb5d" Mar 10 14:06:24 crc kubenswrapper[4911]: I0310 14:06:24.578596 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-knb5d" Mar 10 14:06:24 crc kubenswrapper[4911]: I0310 14:06:24.890231 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w2clr" event={"ID":"e1796b5b-f2e1-4a7a-9463-039bb296626a","Type":"ContainerStarted","Data":"936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e"} Mar 10 14:06:24 crc kubenswrapper[4911]: I0310 14:06:24.892692 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrd7f" event={"ID":"124dcf69-acd6-4d61-ab64-3cf0840df098","Type":"ContainerStarted","Data":"e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1"} Mar 10 14:06:24 crc kubenswrapper[4911]: I0310 14:06:24.895424 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hlsw" event={"ID":"713a2021-c450-41c7-b93d-ccee816a9820","Type":"ContainerStarted","Data":"7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa"} Mar 10 14:06:24 crc kubenswrapper[4911]: I0310 14:06:24.897840 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvtlp" event={"ID":"b986951d-80c8-4f06-a12b-9dd8047a7bf5","Type":"ContainerStarted","Data":"a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38"} Mar 10 14:06:24 crc kubenswrapper[4911]: I0310 14:06:24.900041 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pm6m" event={"ID":"f11c754a-10a0-46ac-b171-5ccfecebdb7c","Type":"ContainerStarted","Data":"bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa"} Mar 10 14:06:24 crc kubenswrapper[4911]: I0310 14:06:24.974151 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-48nht" Mar 10 14:06:24 crc kubenswrapper[4911]: I0310 14:06:24.976987 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9hlsw" podStartSLOduration=3.287190418 podStartE2EDuration="1m3.976972349s" podCreationTimestamp="2026-03-10 14:05:21 +0000 UTC" firstStartedPulling="2026-03-10 14:05:23.7718564 +0000 UTC m=+228.335376317" lastFinishedPulling="2026-03-10 14:06:24.461638331 +0000 UTC m=+289.025158248" observedRunningTime="2026-03-10 14:06:24.976744364 +0000 UTC m=+289.540264291" watchObservedRunningTime="2026-03-10 14:06:24.976972349 +0000 UTC m=+289.540492266" Mar 10 14:06:25 crc kubenswrapper[4911]: I0310 14:06:25.907626 4911 generic.go:334] "Generic (PLEG): container finished" podID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerID="936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e" exitCode=0 Mar 10 14:06:25 crc 
kubenswrapper[4911]: I0310 14:06:25.908520 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w2clr" event={"ID":"e1796b5b-f2e1-4a7a-9463-039bb296626a","Type":"ContainerDied","Data":"936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e"} Mar 10 14:06:26 crc kubenswrapper[4911]: I0310 14:06:26.234931 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48nht"] Mar 10 14:06:26 crc kubenswrapper[4911]: I0310 14:06:26.952488 4911 generic.go:334] "Generic (PLEG): container finished" podID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerID="e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1" exitCode=0 Mar 10 14:06:26 crc kubenswrapper[4911]: I0310 14:06:26.952608 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrd7f" event={"ID":"124dcf69-acd6-4d61-ab64-3cf0840df098","Type":"ContainerDied","Data":"e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1"} Mar 10 14:06:26 crc kubenswrapper[4911]: I0310 14:06:26.952897 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-48nht" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerName="registry-server" containerID="cri-o://d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd" gracePeriod=2 Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.543405 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48nht" Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.680095 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvjl4\" (UniqueName: \"kubernetes.io/projected/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-kube-api-access-pvjl4\") pod \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.680216 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-utilities\") pod \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.680283 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-catalog-content\") pod \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\" (UID: \"744c8ea6-3be6-496d-a6fc-002d3f0f95e4\") " Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.681359 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-utilities" (OuterVolumeSpecName: "utilities") pod "744c8ea6-3be6-496d-a6fc-002d3f0f95e4" (UID: "744c8ea6-3be6-496d-a6fc-002d3f0f95e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.688906 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-kube-api-access-pvjl4" (OuterVolumeSpecName: "kube-api-access-pvjl4") pod "744c8ea6-3be6-496d-a6fc-002d3f0f95e4" (UID: "744c8ea6-3be6-496d-a6fc-002d3f0f95e4"). InnerVolumeSpecName "kube-api-access-pvjl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.706479 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "744c8ea6-3be6-496d-a6fc-002d3f0f95e4" (UID: "744c8ea6-3be6-496d-a6fc-002d3f0f95e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.781378 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.781415 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvjl4\" (UniqueName: \"kubernetes.io/projected/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-kube-api-access-pvjl4\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.781429 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744c8ea6-3be6-496d-a6fc-002d3f0f95e4-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.960222 4911 generic.go:334] "Generic (PLEG): container finished" podID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerID="d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd" exitCode=0 Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.960279 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48nht" event={"ID":"744c8ea6-3be6-496d-a6fc-002d3f0f95e4","Type":"ContainerDied","Data":"d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd"} Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.960334 4911 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-48nht" event={"ID":"744c8ea6-3be6-496d-a6fc-002d3f0f95e4","Type":"ContainerDied","Data":"6aa85add1472227f22140b3780ac28c2673f0e70ad9adb7552bb8c934137f1aa"} Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.960359 4911 scope.go:117] "RemoveContainer" containerID="d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd" Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.960818 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48nht" Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.963026 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w2clr" event={"ID":"e1796b5b-f2e1-4a7a-9463-039bb296626a","Type":"ContainerStarted","Data":"cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c"} Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.965101 4911 generic.go:334] "Generic (PLEG): container finished" podID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerID="a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38" exitCode=0 Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.965163 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvtlp" event={"ID":"b986951d-80c8-4f06-a12b-9dd8047a7bf5","Type":"ContainerDied","Data":"a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38"} Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.976493 4911 generic.go:334] "Generic (PLEG): container finished" podID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerID="bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa" exitCode=0 Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.976549 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pm6m" 
event={"ID":"f11c754a-10a0-46ac-b171-5ccfecebdb7c","Type":"ContainerDied","Data":"bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa"} Mar 10 14:06:27 crc kubenswrapper[4911]: I0310 14:06:27.985367 4911 scope.go:117] "RemoveContainer" containerID="2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0" Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.016243 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w2clr" podStartSLOduration=3.540285197 podStartE2EDuration="1m7.016219595s" podCreationTimestamp="2026-03-10 14:05:21 +0000 UTC" firstStartedPulling="2026-03-10 14:05:23.756398308 +0000 UTC m=+228.319918225" lastFinishedPulling="2026-03-10 14:06:27.232332706 +0000 UTC m=+291.795852623" observedRunningTime="2026-03-10 14:06:28.015286303 +0000 UTC m=+292.578806230" watchObservedRunningTime="2026-03-10 14:06:28.016219595 +0000 UTC m=+292.579739512" Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.043885 4911 scope.go:117] "RemoveContainer" containerID="a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e" Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.048720 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48nht"] Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.056193 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-48nht"] Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.085534 4911 scope.go:117] "RemoveContainer" containerID="d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd" Mar 10 14:06:28 crc kubenswrapper[4911]: E0310 14:06:28.088595 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd\": container with ID starting with 
d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd not found: ID does not exist" containerID="d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd" Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.088648 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd"} err="failed to get container status \"d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd\": rpc error: code = NotFound desc = could not find container \"d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd\": container with ID starting with d8a4b804027e32e791e900cacfddb64c606b914aa9fc762d2481ce9df61862fd not found: ID does not exist" Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.088685 4911 scope.go:117] "RemoveContainer" containerID="2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0" Mar 10 14:06:28 crc kubenswrapper[4911]: E0310 14:06:28.091264 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0\": container with ID starting with 2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0 not found: ID does not exist" containerID="2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0" Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.091318 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0"} err="failed to get container status \"2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0\": rpc error: code = NotFound desc = could not find container \"2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0\": container with ID starting with 2609bc2f411028346104d1ac9bbc7755127f0e1d71246ceefe1ef03c442d06f0 not found: ID does not 
exist" Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.091350 4911 scope.go:117] "RemoveContainer" containerID="a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e" Mar 10 14:06:28 crc kubenswrapper[4911]: E0310 14:06:28.092050 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e\": container with ID starting with a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e not found: ID does not exist" containerID="a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e" Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.092123 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e"} err="failed to get container status \"a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e\": rpc error: code = NotFound desc = could not find container \"a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e\": container with ID starting with a69d814a61c15c8bfce3caf1c959d2a201f5d5cb4b6558b7ad5fe6ec3caf114e not found: ID does not exist" Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.203164 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" path="/var/lib/kubelet/pods/744c8ea6-3be6-496d-a6fc-002d3f0f95e4/volumes" Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.986448 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrd7f" event={"ID":"124dcf69-acd6-4d61-ab64-3cf0840df098","Type":"ContainerStarted","Data":"d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a"} Mar 10 14:06:28 crc kubenswrapper[4911]: I0310 14:06:28.992103 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvtlp" 
event={"ID":"b986951d-80c8-4f06-a12b-9dd8047a7bf5","Type":"ContainerStarted","Data":"aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e"} Mar 10 14:06:29 crc kubenswrapper[4911]: I0310 14:06:29.010005 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vrd7f" podStartSLOduration=7.737807346 podStartE2EDuration="1m6.009983941s" podCreationTimestamp="2026-03-10 14:05:23 +0000 UTC" firstStartedPulling="2026-03-10 14:05:29.836844456 +0000 UTC m=+234.400364373" lastFinishedPulling="2026-03-10 14:06:28.109021051 +0000 UTC m=+292.672540968" observedRunningTime="2026-03-10 14:06:29.009090359 +0000 UTC m=+293.572610276" watchObservedRunningTime="2026-03-10 14:06:29.009983941 +0000 UTC m=+293.573503858" Mar 10 14:06:29 crc kubenswrapper[4911]: I0310 14:06:29.039251 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cvtlp" podStartSLOduration=3.347374247 podStartE2EDuration="1m8.039225555s" podCreationTimestamp="2026-03-10 14:05:21 +0000 UTC" firstStartedPulling="2026-03-10 14:05:23.779358091 +0000 UTC m=+228.342878018" lastFinishedPulling="2026-03-10 14:06:28.471209409 +0000 UTC m=+293.034729326" observedRunningTime="2026-03-10 14:06:29.03735851 +0000 UTC m=+293.600878427" watchObservedRunningTime="2026-03-10 14:06:29.039225555 +0000 UTC m=+293.602745472" Mar 10 14:06:30 crc kubenswrapper[4911]: I0310 14:06:30.010711 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pm6m" event={"ID":"f11c754a-10a0-46ac-b171-5ccfecebdb7c","Type":"ContainerStarted","Data":"7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2"} Mar 10 14:06:30 crc kubenswrapper[4911]: I0310 14:06:30.042491 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9pm6m" podStartSLOduration=12.225462692 podStartE2EDuration="1m6.04246062s" 
podCreationTimestamp="2026-03-10 14:05:24 +0000 UTC" firstStartedPulling="2026-03-10 14:05:35.070893181 +0000 UTC m=+239.634413098" lastFinishedPulling="2026-03-10 14:06:28.887891109 +0000 UTC m=+293.451411026" observedRunningTime="2026-03-10 14:06:30.037962002 +0000 UTC m=+294.601481929" watchObservedRunningTime="2026-03-10 14:06:30.04246062 +0000 UTC m=+294.605980547" Mar 10 14:06:31 crc kubenswrapper[4911]: I0310 14:06:31.403115 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ttx9c" Mar 10 14:06:31 crc kubenswrapper[4911]: I0310 14:06:31.430218 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w2clr" Mar 10 14:06:31 crc kubenswrapper[4911]: I0310 14:06:31.430323 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w2clr" Mar 10 14:06:31 crc kubenswrapper[4911]: I0310 14:06:31.474040 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w2clr" Mar 10 14:06:31 crc kubenswrapper[4911]: I0310 14:06:31.888426 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:06:31 crc kubenswrapper[4911]: I0310 14:06:31.889427 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:06:31 crc kubenswrapper[4911]: I0310 14:06:31.934535 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:06:31 crc kubenswrapper[4911]: I0310 14:06:31.950505 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cvtlp" Mar 10 14:06:31 crc kubenswrapper[4911]: I0310 14:06:31.950545 4911 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cvtlp" Mar 10 14:06:32 crc kubenswrapper[4911]: I0310 14:06:32.012582 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cvtlp" Mar 10 14:06:32 crc kubenswrapper[4911]: I0310 14:06:32.061422 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w2clr" Mar 10 14:06:32 crc kubenswrapper[4911]: I0310 14:06:32.071669 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:06:33 crc kubenswrapper[4911]: I0310 14:06:33.576396 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:06:33 crc kubenswrapper[4911]: I0310 14:06:33.576591 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:06:33 crc kubenswrapper[4911]: I0310 14:06:33.623532 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:06:34 crc kubenswrapper[4911]: I0310 14:06:34.077364 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:06:34 crc kubenswrapper[4911]: I0310 14:06:34.834584 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hlsw"] Mar 10 14:06:34 crc kubenswrapper[4911]: I0310 14:06:34.867441 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9pm6m" Mar 10 14:06:34 crc kubenswrapper[4911]: I0310 14:06:34.867504 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9pm6m" Mar 10 14:06:35 crc 
kubenswrapper[4911]: I0310 14:06:35.040063 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9hlsw" podUID="713a2021-c450-41c7-b93d-ccee816a9820" containerName="registry-server" containerID="cri-o://7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa" gracePeriod=2 Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.565524 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.620023 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-catalog-content\") pod \"713a2021-c450-41c7-b93d-ccee816a9820\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.620174 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-utilities\") pod \"713a2021-c450-41c7-b93d-ccee816a9820\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.620219 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-884ch\" (UniqueName: \"kubernetes.io/projected/713a2021-c450-41c7-b93d-ccee816a9820-kube-api-access-884ch\") pod \"713a2021-c450-41c7-b93d-ccee816a9820\" (UID: \"713a2021-c450-41c7-b93d-ccee816a9820\") " Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.621116 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-utilities" (OuterVolumeSpecName: "utilities") pod "713a2021-c450-41c7-b93d-ccee816a9820" (UID: "713a2021-c450-41c7-b93d-ccee816a9820"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.630671 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713a2021-c450-41c7-b93d-ccee816a9820-kube-api-access-884ch" (OuterVolumeSpecName: "kube-api-access-884ch") pod "713a2021-c450-41c7-b93d-ccee816a9820" (UID: "713a2021-c450-41c7-b93d-ccee816a9820"). InnerVolumeSpecName "kube-api-access-884ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.671629 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "713a2021-c450-41c7-b93d-ccee816a9820" (UID: "713a2021-c450-41c7-b93d-ccee816a9820"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.722243 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.722279 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/713a2021-c450-41c7-b93d-ccee816a9820-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.722290 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-884ch\" (UniqueName: \"kubernetes.io/projected/713a2021-c450-41c7-b93d-ccee816a9820-kube-api-access-884ch\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:35 crc kubenswrapper[4911]: I0310 14:06:35.905730 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9pm6m" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" 
containerName="registry-server" probeResult="failure" output=< Mar 10 14:06:35 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s Mar 10 14:06:35 crc kubenswrapper[4911]: > Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.048601 4911 generic.go:334] "Generic (PLEG): container finished" podID="713a2021-c450-41c7-b93d-ccee816a9820" containerID="7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa" exitCode=0 Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.048900 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hlsw" event={"ID":"713a2021-c450-41c7-b93d-ccee816a9820","Type":"ContainerDied","Data":"7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa"} Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.048978 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hlsw" event={"ID":"713a2021-c450-41c7-b93d-ccee816a9820","Type":"ContainerDied","Data":"ffed09385d3b82fe6e7bb82bc007e54dcccde72716996c51881b682b6de0d5ca"} Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.049007 4911 scope.go:117] "RemoveContainer" containerID="7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa" Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.049007 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hlsw" Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.079924 4911 scope.go:117] "RemoveContainer" containerID="15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714" Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.087602 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hlsw"] Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.090825 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9hlsw"] Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.108197 4911 scope.go:117] "RemoveContainer" containerID="cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b" Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.131053 4911 scope.go:117] "RemoveContainer" containerID="7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa" Mar 10 14:06:36 crc kubenswrapper[4911]: E0310 14:06:36.132048 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa\": container with ID starting with 7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa not found: ID does not exist" containerID="7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa" Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.132075 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa"} err="failed to get container status \"7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa\": rpc error: code = NotFound desc = could not find container \"7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa\": container with ID starting with 7a4dcd485a59d15a91dae889b9a769a39e45d9d4d1307c30aa68f6df5a2bd6fa not 
found: ID does not exist" Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.132097 4911 scope.go:117] "RemoveContainer" containerID="15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714" Mar 10 14:06:36 crc kubenswrapper[4911]: E0310 14:06:36.132436 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714\": container with ID starting with 15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714 not found: ID does not exist" containerID="15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714" Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.132455 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714"} err="failed to get container status \"15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714\": rpc error: code = NotFound desc = could not find container \"15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714\": container with ID starting with 15e0933a48bb905775587c9f7fec5f72e21bc22f97fbafb26f676aae79135714 not found: ID does not exist" Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.132467 4911 scope.go:117] "RemoveContainer" containerID="cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b" Mar 10 14:06:36 crc kubenswrapper[4911]: E0310 14:06:36.132887 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b\": container with ID starting with cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b not found: ID does not exist" containerID="cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b" Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.132910 4911 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b"} err="failed to get container status \"cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b\": rpc error: code = NotFound desc = could not find container \"cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b\": container with ID starting with cbc4e9441a575072a7827bb898b75b9c014fd9d83a6da2d2fc80eec9d67d210b not found: ID does not exist" Mar 10 14:06:36 crc kubenswrapper[4911]: I0310 14:06:36.202438 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713a2021-c450-41c7-b93d-ccee816a9820" path="/var/lib/kubelet/pods/713a2021-c450-41c7-b93d-ccee816a9820/volumes" Mar 10 14:06:40 crc kubenswrapper[4911]: I0310 14:06:40.433257 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84f75b867c-q6wkm"] Mar 10 14:06:40 crc kubenswrapper[4911]: I0310 14:06:40.434100 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" podUID="2d026d19-baf5-42f4-9cd4-10304d8f611e" containerName="controller-manager" containerID="cri-o://8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15" gracePeriod=30 Mar 10 14:06:40 crc kubenswrapper[4911]: I0310 14:06:40.515142 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"] Mar 10 14:06:40 crc kubenswrapper[4911]: I0310 14:06:40.515363 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw" podUID="63fc9dce-4616-49b0-bfec-62499b691d1a" containerName="route-controller-manager" containerID="cri-o://29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36" gracePeriod=30 Mar 10 14:06:40 crc 
kubenswrapper[4911]: I0310 14:06:40.975590 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.007111 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.085324 4911 generic.go:334] "Generic (PLEG): container finished" podID="63fc9dce-4616-49b0-bfec-62499b691d1a" containerID="29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36" exitCode=0 Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.085408 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw" event={"ID":"63fc9dce-4616-49b0-bfec-62499b691d1a","Type":"ContainerDied","Data":"29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36"} Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.085456 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw" event={"ID":"63fc9dce-4616-49b0-bfec-62499b691d1a","Type":"ContainerDied","Data":"ebe92ee0520796a0f9d4a74b4fa20e394c2a501f656cf99c233b23f8d5b36335"} Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.085477 4911 scope.go:117] "RemoveContainer" containerID="29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.085422 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.087340 4911 generic.go:334] "Generic (PLEG): container finished" podID="2d026d19-baf5-42f4-9cd4-10304d8f611e" containerID="8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15" exitCode=0 Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.087368 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" event={"ID":"2d026d19-baf5-42f4-9cd4-10304d8f611e","Type":"ContainerDied","Data":"8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15"} Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.087382 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" event={"ID":"2d026d19-baf5-42f4-9cd4-10304d8f611e","Type":"ContainerDied","Data":"deb8ff2804341a5b9e4ec5223e967003ff4d4f5464c909161ab20932f6bb3b42"} Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.087419 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-84f75b867c-q6wkm" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.101467 4911 scope.go:117] "RemoveContainer" containerID="29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36" Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.102495 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36\": container with ID starting with 29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36 not found: ID does not exist" containerID="29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.102527 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36"} err="failed to get container status \"29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36\": rpc error: code = NotFound desc = could not find container \"29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36\": container with ID starting with 29c02f4382e685c646a04091197a3cb3fd77fbdbc08bb42fae700b6b63be6a36 not found: ID does not exist" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.102572 4911 scope.go:117] "RemoveContainer" containerID="8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.109812 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-client-ca\") pod \"2d026d19-baf5-42f4-9cd4-10304d8f611e\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.109862 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc9dce-4616-49b0-bfec-62499b691d1a-serving-cert\") pod \"63fc9dce-4616-49b0-bfec-62499b691d1a\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.109998 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-proxy-ca-bundles\") pod \"2d026d19-baf5-42f4-9cd4-10304d8f611e\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.110059 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-config\") pod \"2d026d19-baf5-42f4-9cd4-10304d8f611e\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.110157 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-config\") pod \"63fc9dce-4616-49b0-bfec-62499b691d1a\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.110239 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk4xl\" (UniqueName: \"kubernetes.io/projected/2d026d19-baf5-42f4-9cd4-10304d8f611e-kube-api-access-xk4xl\") pod \"2d026d19-baf5-42f4-9cd4-10304d8f611e\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.110342 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-client-ca\") pod \"63fc9dce-4616-49b0-bfec-62499b691d1a\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " Mar 10 14:06:41 crc kubenswrapper[4911]: 
I0310 14:06:41.110398 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m8qr\" (UniqueName: \"kubernetes.io/projected/63fc9dce-4616-49b0-bfec-62499b691d1a-kube-api-access-9m8qr\") pod \"63fc9dce-4616-49b0-bfec-62499b691d1a\" (UID: \"63fc9dce-4616-49b0-bfec-62499b691d1a\") " Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.110429 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d026d19-baf5-42f4-9cd4-10304d8f611e-serving-cert\") pod \"2d026d19-baf5-42f4-9cd4-10304d8f611e\" (UID: \"2d026d19-baf5-42f4-9cd4-10304d8f611e\") " Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.111207 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-client-ca" (OuterVolumeSpecName: "client-ca") pod "2d026d19-baf5-42f4-9cd4-10304d8f611e" (UID: "2d026d19-baf5-42f4-9cd4-10304d8f611e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.112391 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-client-ca" (OuterVolumeSpecName: "client-ca") pod "63fc9dce-4616-49b0-bfec-62499b691d1a" (UID: "63fc9dce-4616-49b0-bfec-62499b691d1a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.112557 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-config" (OuterVolumeSpecName: "config") pod "63fc9dce-4616-49b0-bfec-62499b691d1a" (UID: "63fc9dce-4616-49b0-bfec-62499b691d1a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.112885 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2d026d19-baf5-42f4-9cd4-10304d8f611e" (UID: "2d026d19-baf5-42f4-9cd4-10304d8f611e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.113290 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-config" (OuterVolumeSpecName: "config") pod "2d026d19-baf5-42f4-9cd4-10304d8f611e" (UID: "2d026d19-baf5-42f4-9cd4-10304d8f611e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.116695 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d026d19-baf5-42f4-9cd4-10304d8f611e-kube-api-access-xk4xl" (OuterVolumeSpecName: "kube-api-access-xk4xl") pod "2d026d19-baf5-42f4-9cd4-10304d8f611e" (UID: "2d026d19-baf5-42f4-9cd4-10304d8f611e"). InnerVolumeSpecName "kube-api-access-xk4xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.117006 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63fc9dce-4616-49b0-bfec-62499b691d1a-kube-api-access-9m8qr" (OuterVolumeSpecName: "kube-api-access-9m8qr") pod "63fc9dce-4616-49b0-bfec-62499b691d1a" (UID: "63fc9dce-4616-49b0-bfec-62499b691d1a"). InnerVolumeSpecName "kube-api-access-9m8qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.117092 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d026d19-baf5-42f4-9cd4-10304d8f611e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2d026d19-baf5-42f4-9cd4-10304d8f611e" (UID: "2d026d19-baf5-42f4-9cd4-10304d8f611e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.117216 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63fc9dce-4616-49b0-bfec-62499b691d1a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "63fc9dce-4616-49b0-bfec-62499b691d1a" (UID: "63fc9dce-4616-49b0-bfec-62499b691d1a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.120000 4911 scope.go:117] "RemoveContainer" containerID="8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15" Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.121412 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15\": container with ID starting with 8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15 not found: ID does not exist" containerID="8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.122154 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15"} err="failed to get container status \"8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15\": rpc error: code = NotFound desc = could not find container 
\"8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15\": container with ID starting with 8fe2098b0096bec79c43d87967181171a3e7fbe859bc43934b828c68282d5d15 not found: ID does not exist" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.212126 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.212168 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.212178 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk4xl\" (UniqueName: \"kubernetes.io/projected/2d026d19-baf5-42f4-9cd4-10304d8f611e-kube-api-access-xk4xl\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.212192 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc9dce-4616-49b0-bfec-62499b691d1a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.212202 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m8qr\" (UniqueName: \"kubernetes.io/projected/63fc9dce-4616-49b0-bfec-62499b691d1a-kube-api-access-9m8qr\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.212214 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d026d19-baf5-42f4-9cd4-10304d8f611e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.212223 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.212231 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc9dce-4616-49b0-bfec-62499b691d1a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.212240 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d026d19-baf5-42f4-9cd4-10304d8f611e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.419537 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"] Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.422414 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86786b9875-7z6nw"] Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.431168 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84f75b867c-q6wkm"] Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.435658 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-84f75b867c-q6wkm"] Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.547976 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7tfvh"] Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.738268 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"] Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.738873 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90ef4fc-8e97-468a-b0ba-d7105067b50c" containerName="oc" 
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.738888 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90ef4fc-8e97-468a-b0ba-d7105067b50c" containerName="oc" Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.738898 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d026d19-baf5-42f4-9cd4-10304d8f611e" containerName="controller-manager" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.738906 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d026d19-baf5-42f4-9cd4-10304d8f611e" containerName="controller-manager" Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.738917 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713a2021-c450-41c7-b93d-ccee816a9820" containerName="extract-content" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.738923 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="713a2021-c450-41c7-b93d-ccee816a9820" containerName="extract-content" Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.738932 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fc9dce-4616-49b0-bfec-62499b691d1a" containerName="route-controller-manager" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.738940 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fc9dce-4616-49b0-bfec-62499b691d1a" containerName="route-controller-manager" Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.738956 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerName="extract-content" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.738963 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerName="extract-content" Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.738971 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713a2021-c450-41c7-b93d-ccee816a9820" containerName="registry-server" Mar 
10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.738978 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="713a2021-c450-41c7-b93d-ccee816a9820" containerName="registry-server" Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.738988 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerName="extract-utilities" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.738994 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerName="extract-utilities" Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.739003 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerName="registry-server" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.739009 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerName="registry-server" Mar 10 14:06:41 crc kubenswrapper[4911]: E0310 14:06:41.739021 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713a2021-c450-41c7-b93d-ccee816a9820" containerName="extract-utilities" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.739027 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="713a2021-c450-41c7-b93d-ccee816a9820" containerName="extract-utilities" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.739123 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="744c8ea6-3be6-496d-a6fc-002d3f0f95e4" containerName="registry-server" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.739136 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90ef4fc-8e97-468a-b0ba-d7105067b50c" containerName="oc" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.739145 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d026d19-baf5-42f4-9cd4-10304d8f611e" containerName="controller-manager" Mar 10 
14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.739154 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="63fc9dce-4616-49b0-bfec-62499b691d1a" containerName="route-controller-manager" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.739166 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="713a2021-c450-41c7-b93d-ccee816a9820" containerName="registry-server" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.739605 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.741871 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.742095 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.742378 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.742668 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.742865 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.744350 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"] Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.745428 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.746654 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.746944 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.748200 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.749955 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.750313 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.750656 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.754146 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"] Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.757906 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"] Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.759445 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.763325 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 14:06:41 crc 
kubenswrapper[4911]: I0310 14:06:41.821288 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-proxy-ca-bundles\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.821338 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-config\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.821368 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-config\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.821430 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-client-ca\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.821464 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdhwk\" (UniqueName: 
\"kubernetes.io/projected/44c44298-3652-4f8e-af18-2e038a8e5ecd-kube-api-access-fdhwk\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.821532 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c44298-3652-4f8e-af18-2e038a8e5ecd-serving-cert\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.821570 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6wkp\" (UniqueName: \"kubernetes.io/projected/8bff43b0-a953-404b-a957-2afab9373552-kube-api-access-x6wkp\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.821656 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bff43b0-a953-404b-a957-2afab9373552-serving-cert\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.821685 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-client-ca\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " 
pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.922822 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c44298-3652-4f8e-af18-2e038a8e5ecd-serving-cert\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.922894 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6wkp\" (UniqueName: \"kubernetes.io/projected/8bff43b0-a953-404b-a957-2afab9373552-kube-api-access-x6wkp\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.922942 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bff43b0-a953-404b-a957-2afab9373552-serving-cert\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.923382 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-client-ca\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.924401 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-client-ca\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.925336 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-proxy-ca-bundles\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.925377 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-proxy-ca-bundles\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.925414 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-config\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.925436 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-config\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.926468 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-config\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.926505 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-client-ca\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.926534 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdhwk\" (UniqueName: \"kubernetes.io/projected/44c44298-3652-4f8e-af18-2e038a8e5ecd-kube-api-access-fdhwk\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.926840 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-client-ca\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.926948 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-config\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.927514 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bff43b0-a953-404b-a957-2afab9373552-serving-cert\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.936012 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c44298-3652-4f8e-af18-2e038a8e5ecd-serving-cert\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.948850 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdhwk\" (UniqueName: \"kubernetes.io/projected/44c44298-3652-4f8e-af18-2e038a8e5ecd-kube-api-access-fdhwk\") pod \"route-controller-manager-567c4dddc8-qqjzq\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") " pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.955304 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6wkp\" (UniqueName: \"kubernetes.io/projected/8bff43b0-a953-404b-a957-2afab9373552-kube-api-access-x6wkp\") pod \"controller-manager-759b87b8f8-q4nj8\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:41 crc kubenswrapper[4911]: I0310 14:06:41.995834 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cvtlp"
Mar 10 14:06:42 crc kubenswrapper[4911]: I0310 14:06:42.057258 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:42 crc kubenswrapper[4911]: I0310 14:06:42.068470 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:42 crc kubenswrapper[4911]: I0310 14:06:42.204014 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d026d19-baf5-42f4-9cd4-10304d8f611e" path="/var/lib/kubelet/pods/2d026d19-baf5-42f4-9cd4-10304d8f611e/volumes"
Mar 10 14:06:42 crc kubenswrapper[4911]: I0310 14:06:42.204785 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63fc9dce-4616-49b0-bfec-62499b691d1a" path="/var/lib/kubelet/pods/63fc9dce-4616-49b0-bfec-62499b691d1a/volumes"
Mar 10 14:06:42 crc kubenswrapper[4911]: I0310 14:06:42.295532 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"]
Mar 10 14:06:42 crc kubenswrapper[4911]: W0310 14:06:42.302176 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bff43b0_a953_404b_a957_2afab9373552.slice/crio-3b35a06e4ed3eb634246aca3e5e29efafcba293be0989a4619b2cb4cb618702b WatchSource:0}: Error finding container 3b35a06e4ed3eb634246aca3e5e29efafcba293be0989a4619b2cb4cb618702b: Status 404 returned error can't find the container with id 3b35a06e4ed3eb634246aca3e5e29efafcba293be0989a4619b2cb4cb618702b
Mar 10 14:06:42 crc kubenswrapper[4911]: I0310 14:06:42.465957 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"]
Mar 10 14:06:42 crc kubenswrapper[4911]: W0310 14:06:42.471258 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c44298_3652_4f8e_af18_2e038a8e5ecd.slice/crio-ea8e75bbd11215fee96c2a0c2d26b1e8a26ecd881760a313a1163c9408f51753 WatchSource:0}: Error finding container ea8e75bbd11215fee96c2a0c2d26b1e8a26ecd881760a313a1163c9408f51753: Status 404 returned error can't find the container with id ea8e75bbd11215fee96c2a0c2d26b1e8a26ecd881760a313a1163c9408f51753
Mar 10 14:06:43 crc kubenswrapper[4911]: I0310 14:06:43.112595 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" event={"ID":"8bff43b0-a953-404b-a957-2afab9373552","Type":"ContainerStarted","Data":"5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1"}
Mar 10 14:06:43 crc kubenswrapper[4911]: I0310 14:06:43.112919 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" event={"ID":"8bff43b0-a953-404b-a957-2afab9373552","Type":"ContainerStarted","Data":"3b35a06e4ed3eb634246aca3e5e29efafcba293be0989a4619b2cb4cb618702b"}
Mar 10 14:06:43 crc kubenswrapper[4911]: I0310 14:06:43.112939 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:43 crc kubenswrapper[4911]: I0310 14:06:43.115445 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" event={"ID":"44c44298-3652-4f8e-af18-2e038a8e5ecd","Type":"ContainerStarted","Data":"4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc"}
Mar 10 14:06:43 crc kubenswrapper[4911]: I0310 14:06:43.115503 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" event={"ID":"44c44298-3652-4f8e-af18-2e038a8e5ecd","Type":"ContainerStarted","Data":"ea8e75bbd11215fee96c2a0c2d26b1e8a26ecd881760a313a1163c9408f51753"}
Mar 10 14:06:43 crc kubenswrapper[4911]: I0310 14:06:43.115660 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:43 crc kubenswrapper[4911]: I0310 14:06:43.121041 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:06:43 crc kubenswrapper[4911]: I0310 14:06:43.134485 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" podStartSLOduration=3.134464845 podStartE2EDuration="3.134464845s" podCreationTimestamp="2026-03-10 14:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:06:43.132025176 +0000 UTC m=+307.695545113" watchObservedRunningTime="2026-03-10 14:06:43.134464845 +0000 UTC m=+307.697984762"
Mar 10 14:06:43 crc kubenswrapper[4911]: I0310 14:06:43.155901 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" podStartSLOduration=3.155877511 podStartE2EDuration="3.155877511s" podCreationTimestamp="2026-03-10 14:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:06:43.155265516 +0000 UTC m=+307.718785453" watchObservedRunningTime="2026-03-10 14:06:43.155877511 +0000 UTC m=+307.719397428"
Mar 10 14:06:43 crc kubenswrapper[4911]: I0310 14:06:43.302906 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"
Mar 10 14:06:44 crc kubenswrapper[4911]: I0310 14:06:44.634842 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cvtlp"]
Mar 10 14:06:44 crc kubenswrapper[4911]: I0310 14:06:44.635109 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cvtlp" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerName="registry-server" containerID="cri-o://aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e" gracePeriod=2
Mar 10 14:06:44 crc kubenswrapper[4911]: I0310 14:06:44.912623 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:06:44 crc kubenswrapper[4911]: I0310 14:06:44.953307 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9pm6m"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.015453 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cvtlp"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.092436 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-utilities\") pod \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") "
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.092491 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-catalog-content\") pod \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") "
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.092584 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8jfr\" (UniqueName: \"kubernetes.io/projected/b986951d-80c8-4f06-a12b-9dd8047a7bf5-kube-api-access-c8jfr\") pod \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\" (UID: \"b986951d-80c8-4f06-a12b-9dd8047a7bf5\") "
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.093329 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-utilities" (OuterVolumeSpecName: "utilities") pod "b986951d-80c8-4f06-a12b-9dd8047a7bf5" (UID: "b986951d-80c8-4f06-a12b-9dd8047a7bf5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.098124 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b986951d-80c8-4f06-a12b-9dd8047a7bf5-kube-api-access-c8jfr" (OuterVolumeSpecName: "kube-api-access-c8jfr") pod "b986951d-80c8-4f06-a12b-9dd8047a7bf5" (UID: "b986951d-80c8-4f06-a12b-9dd8047a7bf5"). InnerVolumeSpecName "kube-api-access-c8jfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.130199 4911 generic.go:334] "Generic (PLEG): container finished" podID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerID="aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e" exitCode=0
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.130375 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvtlp" event={"ID":"b986951d-80c8-4f06-a12b-9dd8047a7bf5","Type":"ContainerDied","Data":"aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e"}
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.130433 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvtlp" event={"ID":"b986951d-80c8-4f06-a12b-9dd8047a7bf5","Type":"ContainerDied","Data":"1119858619d302b238be1211ea5698919224e2f18b66de4050bcf65b71480887"}
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.130457 4911 scope.go:117] "RemoveContainer" containerID="aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.130596 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cvtlp"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.151288 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b986951d-80c8-4f06-a12b-9dd8047a7bf5" (UID: "b986951d-80c8-4f06-a12b-9dd8047a7bf5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.155064 4911 scope.go:117] "RemoveContainer" containerID="a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.180977 4911 scope.go:117] "RemoveContainer" containerID="1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.193996 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.194035 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b986951d-80c8-4f06-a12b-9dd8047a7bf5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.194045 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8jfr\" (UniqueName: \"kubernetes.io/projected/b986951d-80c8-4f06-a12b-9dd8047a7bf5-kube-api-access-c8jfr\") on node \"crc\" DevicePath \"\""
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.207863 4911 scope.go:117] "RemoveContainer" containerID="aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.208606 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e\": container with ID starting with aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e not found: ID does not exist" containerID="aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.208663 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e"} err="failed to get container status \"aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e\": rpc error: code = NotFound desc = could not find container \"aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e\": container with ID starting with aacdd87ff1b249e045e7c66a4ef770eeb99868764f24c1b351019082f7989e9e not found: ID does not exist"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.208691 4911 scope.go:117] "RemoveContainer" containerID="a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.209422 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38\": container with ID starting with a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38 not found: ID does not exist" containerID="a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.209451 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38"} err="failed to get container status \"a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38\": rpc error: code = NotFound desc = could not find container \"a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38\": container with ID starting with a7ca265212517eaefa10fd1f6c64ea8b39eb515226e763bb4d08858300647d38 not found: ID does not exist"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.209465 4911 scope.go:117] "RemoveContainer" containerID="1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.209859 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4\": container with ID starting with 1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4 not found: ID does not exist" containerID="1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.209916 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4"} err="failed to get container status \"1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4\": rpc error: code = NotFound desc = could not find container \"1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4\": container with ID starting with 1bdd53443661b3d09ab5e96e8e836478a653421fc6d6e07af9b94e924d2a9ab4 not found: ID does not exist"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.357693 4911 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.358288 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerName="registry-server"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.358371 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerName="registry-server"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.358433 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerName="extract-utilities"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.358487 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerName="extract-utilities"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.358676 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerName="extract-content"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.358781 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerName="extract-content"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.359007 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" containerName="registry-server"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.359492 4911 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.359604 4911 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.359713 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360407 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807" gracePeriod=15
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360500 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b" gracePeriod=15
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360485 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb" gracePeriod=15
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.360598 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360606 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a" gracePeriod=15
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360637 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.360801 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360809 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.360817 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360828 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.360840 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360847 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.360857 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360864 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.360875 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360882 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.360892 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360898 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.360910 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360916 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.360923 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360929 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.360428 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf" gracePeriod=15
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.361087 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.361099 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.361106 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.361129 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.361137 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.361144 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.361153 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.361159 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.361261 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.361269 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.361369 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.363883 4911 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.402347 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.446811 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": dial tcp 38.102.83.153:6443: connect: connection refused"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.447179 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: connection refused"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.498432 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.498514 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.498553 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.498596 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.498709 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.498824 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.498982 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.499282 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601455 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601537 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601561 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601580 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601600 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601628 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601665 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601627 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601739 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601742 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601698 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 
14:06:45.601743 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601594 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601750 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601681 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.601798 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: I0310 14:06:45.691851 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 14:06:45 crc kubenswrapper[4911]: W0310 14:06:45.740543 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-a0ca5db49531aa0291ba2e21a0726790604cdfb1980307ddccd7b2c0e308aeab WatchSource:0}: Error finding container a0ca5db49531aa0291ba2e21a0726790604cdfb1980307ddccd7b2c0e308aeab: Status 404 returned error can't find the container with id a0ca5db49531aa0291ba2e21a0726790604cdfb1980307ddccd7b2c0e308aeab Mar 10 14:06:45 crc kubenswrapper[4911]: E0310 14:06:45.747238 4911 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.153:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b7ffda96ad0d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:06:45.74622536 +0000 UTC m=+310.309745287,LastTimestamp:2026-03-10 14:06:45.74622536 +0000 UTC m=+310.309745287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.142835 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"224c49a96b06277acf0c842176bcc63806b7e8f293c981d4a0d38dd2bd040ae3"} Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.142909 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a0ca5db49531aa0291ba2e21a0726790604cdfb1980307ddccd7b2c0e308aeab"} Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.143802 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.144291 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.145528 4911 generic.go:334] "Generic (PLEG): container finished" podID="7365bf68-b4d4-4df4-8e73-6c2336e58792" containerID="efbeefe26f2d4b1d8d43736ff1c2fe5470a3df1ce5c40ec70540d9ab83dd5cfe" exitCode=0 Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.145573 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7365bf68-b4d4-4df4-8e73-6c2336e58792","Type":"ContainerDied","Data":"efbeefe26f2d4b1d8d43736ff1c2fe5470a3df1ce5c40ec70540d9ab83dd5cfe"} Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.146279 4911 
status_manager.go:851] "Failed to get status for pod" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.146648 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.147410 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.151726 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.153632 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.154523 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb" exitCode=0 Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.154554 4911 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf" exitCode=0 Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.154565 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b" exitCode=0 Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.154579 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a" exitCode=2 Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.154630 4911 scope.go:117] "RemoveContainer" containerID="75c6f4043c552fe3c31e15d6fc7d82444319af10b8fc33db6a89a25c4fb5a666" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.194132 4911 status_manager.go:851] "Failed to get status for pod" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.195236 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:46 crc kubenswrapper[4911]: I0310 14:06:46.195753 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: 
connection refused" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.163347 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.547177 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.548458 4911 status_manager.go:851] "Failed to get status for pod" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.549171 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.549571 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.640183 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7365bf68-b4d4-4df4-8e73-6c2336e58792-kube-api-access\") pod \"7365bf68-b4d4-4df4-8e73-6c2336e58792\" (UID: 
\"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.640310 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-kubelet-dir\") pod \"7365bf68-b4d4-4df4-8e73-6c2336e58792\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.640346 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-var-lock\") pod \"7365bf68-b4d4-4df4-8e73-6c2336e58792\" (UID: \"7365bf68-b4d4-4df4-8e73-6c2336e58792\") " Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.640498 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7365bf68-b4d4-4df4-8e73-6c2336e58792" (UID: "7365bf68-b4d4-4df4-8e73-6c2336e58792"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.640557 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-var-lock" (OuterVolumeSpecName: "var-lock") pod "7365bf68-b4d4-4df4-8e73-6c2336e58792" (UID: "7365bf68-b4d4-4df4-8e73-6c2336e58792"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.640984 4911 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.641032 4911 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7365bf68-b4d4-4df4-8e73-6c2336e58792-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.656381 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7365bf68-b4d4-4df4-8e73-6c2336e58792-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7365bf68-b4d4-4df4-8e73-6c2336e58792" (UID: "7365bf68-b4d4-4df4-8e73-6c2336e58792"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.732876 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.733976 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.734850 4911 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.735612 4911 status_manager.go:851] "Failed to get status for pod" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.736169 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.736645 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.742559 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7365bf68-b4d4-4df4-8e73-6c2336e58792-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.843285 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.843457 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.843744 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.843953 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.844107 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.844468 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.844656 4911 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.844683 4911 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:47 crc kubenswrapper[4911]: I0310 14:06:47.844695 4911 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.174519 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7365bf68-b4d4-4df4-8e73-6c2336e58792","Type":"ContainerDied","Data":"ed4f5bd1e7d618ef68e89e456fe84b9466f7f2b0af140fa16b9e3e3251763702"} Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.174581 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed4f5bd1e7d618ef68e89e456fe84b9466f7f2b0af140fa16b9e3e3251763702" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.174664 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.179192 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.180444 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807" exitCode=0 Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.180535 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.180559 4911 scope.go:117] "RemoveContainer" containerID="dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.198888 4911 status_manager.go:851] "Failed to get status for pod" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.199184 4911 scope.go:117] "RemoveContainer" containerID="f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.199307 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.199576 4911 status_manager.go:851] "Failed to 
get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.199873 4911 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.200139 4911 status_manager.go:851] "Failed to get status for pod" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.200467 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.200911 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.201217 4911 status_manager.go:851] 
"Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.205156 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.215987 4911 scope.go:117] "RemoveContainer" containerID="01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.239666 4911 scope.go:117] "RemoveContainer" containerID="c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.255563 4911 scope.go:117] "RemoveContainer" containerID="593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.273435 4911 scope.go:117] "RemoveContainer" containerID="c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.293434 4911 scope.go:117] "RemoveContainer" containerID="dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb" Mar 10 14:06:48 crc kubenswrapper[4911]: E0310 14:06:48.294149 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\": container with ID starting with dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb not found: ID does not exist" containerID="dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.294191 4911 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb"} err="failed to get container status \"dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\": rpc error: code = NotFound desc = could not find container \"dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb\": container with ID starting with dd7b43c52bea070cf933e8750ef26106c00932bb2b8ae25667a14d14ca92c8cb not found: ID does not exist" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.294253 4911 scope.go:117] "RemoveContainer" containerID="f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf" Mar 10 14:06:48 crc kubenswrapper[4911]: E0310 14:06:48.294778 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\": container with ID starting with f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf not found: ID does not exist" containerID="f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.294858 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf"} err="failed to get container status \"f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\": rpc error: code = NotFound desc = could not find container \"f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf\": container with ID starting with f47b9ec4a72bda1e1eab82eb560b36281d0557eaa53f747f28a80eca1588a2bf not found: ID does not exist" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.294907 4911 scope.go:117] "RemoveContainer" containerID="01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b" Mar 10 14:06:48 crc kubenswrapper[4911]: E0310 14:06:48.295458 4911 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\": container with ID starting with 01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b not found: ID does not exist" containerID="01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.295505 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b"} err="failed to get container status \"01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\": rpc error: code = NotFound desc = could not find container \"01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b\": container with ID starting with 01dc6e93c0077ac9171b96ca941d45fde04a57b48fa980ae3204495916f1fa3b not found: ID does not exist" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.295535 4911 scope.go:117] "RemoveContainer" containerID="c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a" Mar 10 14:06:48 crc kubenswrapper[4911]: E0310 14:06:48.297271 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\": container with ID starting with c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a not found: ID does not exist" containerID="c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.297316 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a"} err="failed to get container status \"c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\": rpc error: code = NotFound desc = could 
not find container \"c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a\": container with ID starting with c22d756e903abc4962deed7fa792ef5bcec3da92c6120d8d6812e80e0502eb9a not found: ID does not exist" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.297342 4911 scope.go:117] "RemoveContainer" containerID="593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807" Mar 10 14:06:48 crc kubenswrapper[4911]: E0310 14:06:48.299921 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\": container with ID starting with 593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807 not found: ID does not exist" containerID="593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.299973 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807"} err="failed to get container status \"593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\": rpc error: code = NotFound desc = could not find container \"593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807\": container with ID starting with 593bee20ca56768705f9a84d0d05b569f1868458dccef8afad77b82633051807 not found: ID does not exist" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.300007 4911 scope.go:117] "RemoveContainer" containerID="c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12" Mar 10 14:06:48 crc kubenswrapper[4911]: E0310 14:06:48.300590 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\": container with ID starting with c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12 not found: 
ID does not exist" containerID="c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12" Mar 10 14:06:48 crc kubenswrapper[4911]: I0310 14:06:48.300631 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12"} err="failed to get container status \"c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\": rpc error: code = NotFound desc = could not find container \"c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12\": container with ID starting with c19fc89cf604f608fbc155759a0e0d1e4d29d6f9f7f9e306ee3701a17fc68c12 not found: ID does not exist" Mar 10 14:06:50 crc kubenswrapper[4911]: E0310 14:06:50.512501 4911 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:50 crc kubenswrapper[4911]: E0310 14:06:50.513850 4911 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:50 crc kubenswrapper[4911]: E0310 14:06:50.514548 4911 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:50 crc kubenswrapper[4911]: E0310 14:06:50.515214 4911 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:50 crc kubenswrapper[4911]: E0310 14:06:50.515606 4911 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:50 crc kubenswrapper[4911]: I0310 14:06:50.515653 4911 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 14:06:50 crc kubenswrapper[4911]: E0310 14:06:50.516063 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="200ms" Mar 10 14:06:50 crc kubenswrapper[4911]: E0310 14:06:50.717087 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="400ms" Mar 10 14:06:51 crc kubenswrapper[4911]: E0310 14:06:51.117663 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="800ms" Mar 10 14:06:51 crc kubenswrapper[4911]: E0310 14:06:51.908843 4911 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.153:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b7ffda96ad0d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 14:06:45.74622536 +0000 UTC m=+310.309745287,LastTimestamp:2026-03-10 14:06:45.74622536 +0000 UTC m=+310.309745287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 14:06:51 crc kubenswrapper[4911]: E0310 14:06:51.918676 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="1.6s" Mar 10 14:06:53 crc kubenswrapper[4911]: E0310 14:06:53.520145 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="3.2s" Mar 10 14:06:56 crc kubenswrapper[4911]: I0310 14:06:56.192931 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:06:56 crc kubenswrapper[4911]: I0310 14:06:56.196377 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:56 crc kubenswrapper[4911]: I0310 14:06:56.196738 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:56 crc kubenswrapper[4911]: I0310 14:06:56.197191 4911 status_manager.go:851] "Failed to get status for pod" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:56 crc kubenswrapper[4911]: I0310 14:06:56.198017 4911 status_manager.go:851] "Failed to get status for pod" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:56 crc kubenswrapper[4911]: I0310 14:06:56.198796 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": 
dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:56 crc kubenswrapper[4911]: I0310 14:06:56.199126 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:56 crc kubenswrapper[4911]: I0310 14:06:56.221854 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3" Mar 10 14:06:56 crc kubenswrapper[4911]: I0310 14:06:56.221896 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3" Mar 10 14:06:56 crc kubenswrapper[4911]: E0310 14:06:56.222424 4911 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:06:56 crc kubenswrapper[4911]: I0310 14:06:56.223171 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:06:56 crc kubenswrapper[4911]: E0310 14:06:56.722004 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.153:6443: connect: connection refused" interval="6.4s" Mar 10 14:06:57 crc kubenswrapper[4911]: I0310 14:06:57.240216 4911 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="34fc0dac9b8d384c352a6d35db00f216e6893c80a2878c70503b336629170abb" exitCode=0 Mar 10 14:06:57 crc kubenswrapper[4911]: I0310 14:06:57.240304 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"34fc0dac9b8d384c352a6d35db00f216e6893c80a2878c70503b336629170abb"} Mar 10 14:06:57 crc kubenswrapper[4911]: I0310 14:06:57.240426 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"56a0e3d693652a7648cf82af9adacb8726bb003ce99af2e97e1d67f9b9996d45"} Mar 10 14:06:57 crc kubenswrapper[4911]: I0310 14:06:57.241051 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3" Mar 10 14:06:57 crc kubenswrapper[4911]: I0310 14:06:57.241074 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3" Mar 10 14:06:57 crc kubenswrapper[4911]: I0310 14:06:57.241582 4911 status_manager.go:851] "Failed to get status for pod" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:57 crc kubenswrapper[4911]: E0310 14:06:57.241750 4911 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:06:57 crc kubenswrapper[4911]: I0310 14:06:57.242216 4911 status_manager.go:851] "Failed to get status for pod" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" pod="openshift-marketplace/certified-operators-cvtlp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cvtlp\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:57 crc kubenswrapper[4911]: I0310 14:06:57.242652 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.153:6443: connect: connection refused" Mar 10 14:06:58 crc kubenswrapper[4911]: I0310 14:06:58.197476 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:06:58 crc kubenswrapper[4911]: I0310 14:06:58.197803 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:06:58 crc kubenswrapper[4911]: I0310 14:06:58.197844 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:06:58 crc kubenswrapper[4911]: I0310 14:06:58.197882 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:06:58 crc kubenswrapper[4911]: I0310 14:06:58.249852 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ca4abfc6a339d0bfcdc09c5f55dc0cfc4571e877c67db67b0e311aecc153a69b"} Mar 10 14:06:58 crc kubenswrapper[4911]: I0310 14:06:58.249913 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"be7e0499042fdf3aa86cb7181182ed59eee58b331a0812adfb9046880643227d"} Mar 10 14:06:58 crc kubenswrapper[4911]: I0310 14:06:58.249927 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4bffb5ad524b2edbedf3113f6088bf1109d2d488cc2f640c7f9c57761f9093b3"} Mar 10 14:06:58 crc kubenswrapper[4911]: I0310 14:06:58.249939 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2e31f1ecab0c8dcf7b2c206e6f42edfd6d09b8a7d7cbc6ef9902ef370c5d785c"} Mar 10 14:06:58 crc kubenswrapper[4911]: I0310 14:06:58.400608 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:06:59 crc kubenswrapper[4911]: E0310 14:06:59.198913 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 14:06:59 crc kubenswrapper[4911]: E0310 14:06:59.199598 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 14:06:59 crc kubenswrapper[4911]: E0310 14:06:59.199664 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 10 14:06:59 crc kubenswrapper[4911]: E0310 14:06:59.199733 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:09:01.199705335 +0000 UTC m=+445.763225252 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 10 14:06:59 crc kubenswrapper[4911]: E0310 14:06:59.199783 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 14:06:59 crc kubenswrapper[4911]: E0310 14:06:59.199816 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 14:09:01.199809827 +0000 UTC m=+445.763329744 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 10 14:06:59 crc kubenswrapper[4911]: I0310 14:06:59.264836 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dceb52a5e304f5ff26d6765490ba2fa0791269170c66b10febdc644251e56b1e"} Mar 10 14:06:59 crc kubenswrapper[4911]: I0310 14:06:59.265450 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3" Mar 10 14:06:59 crc kubenswrapper[4911]: I0310 14:06:59.265484 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3" Mar 10 14:06:59 crc kubenswrapper[4911]: I0310 14:06:59.268259 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 14:06:59 crc kubenswrapper[4911]: I0310 14:06:59.269170 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 14:06:59 crc kubenswrapper[4911]: I0310 14:06:59.269263 4911 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9" exitCode=1 Mar 10 14:06:59 crc kubenswrapper[4911]: I0310 14:06:59.269310 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9"} Mar 10 14:06:59 crc kubenswrapper[4911]: I0310 14:06:59.270185 4911 scope.go:117] "RemoveContainer" containerID="711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9" Mar 10 14:06:59 crc kubenswrapper[4911]: E0310 14:06:59.401497 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: failed to sync secret cache: timed out waiting for the condition Mar 10 14:06:59 crc kubenswrapper[4911]: E0310 14:06:59.401623 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs podName:d7a44efc-20ad-4c01-9606-e6fdb5e0c721 nodeName:}" failed. No retries permitted until 2026-03-10 14:09:01.40159552 +0000 UTC m=+445.965115457 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs") pod "network-metrics-daemon-r28f8" (UID: "d7a44efc-20ad-4c01-9606-e6fdb5e0c721") : failed to sync secret cache: timed out waiting for the condition Mar 10 14:07:00 crc kubenswrapper[4911]: I0310 14:07:00.031272 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:07:00 crc kubenswrapper[4911]: E0310 14:07:00.199291 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 14:07:00 crc kubenswrapper[4911]: E0310 14:07:00.199330 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 10 14:07:00 crc kubenswrapper[4911]: E0310 14:07:00.199412 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 14:09:02.199388194 +0000 UTC m=+446.762908111 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 10 14:07:00 crc kubenswrapper[4911]: E0310 14:07:00.199862 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 14:07:00 crc kubenswrapper[4911]: E0310 14:07:00.199899 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 10 14:07:00 crc kubenswrapper[4911]: E0310 14:07:00.199989 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 14:09:02.199961688 +0000 UTC m=+446.763481595 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition
Mar 10 14:07:00 crc kubenswrapper[4911]: I0310 14:07:00.281000 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 10 14:07:00 crc kubenswrapper[4911]: I0310 14:07:00.281579 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 10 14:07:00 crc kubenswrapper[4911]: I0310 14:07:00.281640 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6e4b86e710d4c726b7493f91a8a971cdd50cad2c81ccec34db0170b726ad87bf"}
Mar 10 14:07:00 crc kubenswrapper[4911]: I0310 14:07:00.579363 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 14:07:00 crc kubenswrapper[4911]: I0310 14:07:00.579926 4911 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 10 14:07:00 crc kubenswrapper[4911]: I0310 14:07:00.580035 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 10 14:07:00 crc kubenswrapper[4911]: I0310 14:07:00.985959 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 14:07:01 crc kubenswrapper[4911]: I0310 14:07:01.224566 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:07:01 crc kubenswrapper[4911]: I0310 14:07:01.224641 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:07:01 crc kubenswrapper[4911]: I0310 14:07:01.232100 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:07:03 crc kubenswrapper[4911]: I0310 14:07:03.954154 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 10 14:07:03 crc kubenswrapper[4911]: I0310 14:07:03.954171 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 10 14:07:03 crc kubenswrapper[4911]: I0310 14:07:03.954238 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 10 14:07:03 crc kubenswrapper[4911]: I0310 14:07:03.954372 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 10 14:07:04 crc kubenswrapper[4911]: I0310 14:07:04.203165 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 10 14:07:04 crc kubenswrapper[4911]: I0310 14:07:04.277256 4911 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:07:04 crc kubenswrapper[4911]: I0310 14:07:04.309249 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3"
Mar 10 14:07:04 crc kubenswrapper[4911]: I0310 14:07:04.309292 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3"
Mar 10 14:07:04 crc kubenswrapper[4911]: I0310 14:07:04.309886 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:07:04 crc kubenswrapper[4911]: I0310 14:07:04.321353 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 14:07:05 crc kubenswrapper[4911]: E0310 14:07:05.221567 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 14:07:05 crc kubenswrapper[4911]: E0310 14:07:05.238820 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 14:07:05 crc kubenswrapper[4911]: E0310 14:07:05.245587 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-r28f8" podUID="d7a44efc-20ad-4c01-9606-e6fdb5e0c721"
Mar 10 14:07:05 crc kubenswrapper[4911]: E0310 14:07:05.253299 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 14:07:05 crc kubenswrapper[4911]: I0310 14:07:05.316471 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3"
Mar 10 14:07:05 crc kubenswrapper[4911]: I0310 14:07:05.316530 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3"
Mar 10 14:07:06 crc kubenswrapper[4911]: I0310 14:07:06.206274 4911 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ac154782-fb28-4a7b-8c7f-ed380597be5e"
Mar 10 14:07:06 crc kubenswrapper[4911]: I0310 14:07:06.321582 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3"
Mar 10 14:07:06 crc kubenswrapper[4911]: I0310 14:07:06.321622 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3"
Mar 10 14:07:06 crc kubenswrapper[4911]: I0310 14:07:06.325867 4911 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ac154782-fb28-4a7b-8c7f-ed380597be5e"
Mar 10 14:07:06 crc kubenswrapper[4911]: I0310 14:07:06.597726 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" podUID="2160b26f-7876-42d1-8d78-22f6f57cb08e" containerName="oauth-openshift" containerID="cri-o://c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa" gracePeriod=15
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.054483 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.255387 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-trusted-ca-bundle\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.255927 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-session\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256030 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-provider-selection\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256137 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-idp-0-file-data\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256306 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-login\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256414 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxrsh\" (UniqueName: \"kubernetes.io/projected/2160b26f-7876-42d1-8d78-22f6f57cb08e-kube-api-access-sxrsh\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256507 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-dir\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256589 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-serving-cert\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256685 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-policies\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256776 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-router-certs\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256890 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-cliconfig\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256995 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-ocp-branding-template\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.257079 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-service-ca\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.257162 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-error\") pod \"2160b26f-7876-42d1-8d78-22f6f57cb08e\" (UID: \"2160b26f-7876-42d1-8d78-22f6f57cb08e\") "
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256768 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.256836 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.257647 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.257745 4911 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.258711 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.259000 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.259073 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.264008 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2160b26f-7876-42d1-8d78-22f6f57cb08e-kube-api-access-sxrsh" (OuterVolumeSpecName: "kube-api-access-sxrsh") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "kube-api-access-sxrsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.264135 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.264941 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.265098 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.265340 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.265570 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.266050 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.266673 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.279664 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2160b26f-7876-42d1-8d78-22f6f57cb08e" (UID: "2160b26f-7876-42d1-8d78-22f6f57cb08e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.332781 4911 generic.go:334] "Generic (PLEG): container finished" podID="2160b26f-7876-42d1-8d78-22f6f57cb08e" containerID="c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa" exitCode=0
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.332864 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" event={"ID":"2160b26f-7876-42d1-8d78-22f6f57cb08e","Type":"ContainerDied","Data":"c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa"}
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.332916 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh"
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.332949 4911 scope.go:117] "RemoveContainer" containerID="c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa"
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.332933 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7tfvh" event={"ID":"2160b26f-7876-42d1-8d78-22f6f57cb08e","Type":"ContainerDied","Data":"6807294a814c105ef7219b233c1cc643734827f9fd24d91dfce76bae3a18c60e"}
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.355388 4911 scope.go:117] "RemoveContainer" containerID="c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa"
Mar 10 14:07:07 crc kubenswrapper[4911]: E0310 14:07:07.357264 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa\": container with ID starting with c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa not found: ID does not exist" containerID="c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa"
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.357315 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa"} err="failed to get container status \"c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa\": rpc error: code = NotFound desc = could not find container \"c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa\": container with ID starting with c2658d1348fdb37bcacc94d20deecf3f831747fe131896e4d75201a3bfd981aa not found: ID does not exist"
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359334 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359379 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359400 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359415 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359428 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359442 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359456 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359472 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359484 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxrsh\" (UniqueName: \"kubernetes.io/projected/2160b26f-7876-42d1-8d78-22f6f57cb08e-kube-api-access-sxrsh\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359496 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359509 4911 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2160b26f-7876-42d1-8d78-22f6f57cb08e-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:07 crc kubenswrapper[4911]: I0310 14:07:07.359526 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2160b26f-7876-42d1-8d78-22f6f57cb08e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 10 14:07:10 crc kubenswrapper[4911]: I0310 14:07:10.579681 4911 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 10 14:07:10 crc kubenswrapper[4911]: I0310 14:07:10.580105 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 10 14:07:10 crc kubenswrapper[4911]: I0310 14:07:10.776566 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 10 14:07:11 crc kubenswrapper[4911]: I0310 14:07:11.378254 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 10 14:07:11 crc kubenswrapper[4911]: I0310 14:07:11.636292 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 14:07:12 crc kubenswrapper[4911]: I0310 14:07:12.779677 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 10 14:07:13 crc kubenswrapper[4911]: I0310 14:07:13.356454 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 10 14:07:14 crc kubenswrapper[4911]: I0310 14:07:14.233609 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 14:07:14 crc kubenswrapper[4911]: I0310 14:07:14.273359 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 10 14:07:15 crc kubenswrapper[4911]: I0310 14:07:15.376666 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 10 14:07:15 crc kubenswrapper[4911]: I0310 14:07:15.669862 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 10 14:07:16 crc kubenswrapper[4911]: I0310 14:07:16.131397 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 10 14:07:16 crc kubenswrapper[4911]: I0310 14:07:16.291958 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 10 14:07:16 crc kubenswrapper[4911]: I0310 14:07:16.389952 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 10 14:07:16 crc kubenswrapper[4911]: I0310 14:07:16.490261 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 10 14:07:16 crc kubenswrapper[4911]: I0310 14:07:16.795919 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 10 14:07:16 crc kubenswrapper[4911]: I0310 14:07:16.967095 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 10 14:07:16 crc kubenswrapper[4911]: I0310 14:07:16.971865 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 10 14:07:17 crc kubenswrapper[4911]: I0310 14:07:17.322285 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 10 14:07:17 crc kubenswrapper[4911]: I0310 14:07:17.324274 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 10 14:07:17 crc kubenswrapper[4911]: I0310 14:07:17.624396 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 10 14:07:17 crc kubenswrapper[4911]: I0310 14:07:17.763804 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 10 14:07:17 crc kubenswrapper[4911]: I0310 14:07:17.789909 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 10 14:07:17 crc kubenswrapper[4911]: I0310 14:07:17.825441 4911 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 10 14:07:17 crc kubenswrapper[4911]: I0310 14:07:17.937300 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.066625 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.090415 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.180279 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.190921 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.193120 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.365023 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.414699 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.421623 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.574854 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.611964 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.615188 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.770535 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.794270 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.819161 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.822598 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.840755 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 10 14:07:18 crc kubenswrapper[4911]: I0310 14:07:18.962609 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.068091 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.152919 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.173161 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.192435 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.209224 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.234190 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.451482 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.452577 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.606802 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.680229 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 10 14:07:19 crc kubenswrapper[4911]: I0310 14:07:19.895243 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.033135 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.050400 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.105800 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.192698 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.192717 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.256278 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.294131 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.485218 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.580048 4911 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.580407 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.580565 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.581235 4911 reflector.go:368] Caches
populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.581557 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"6e4b86e710d4c726b7493f91a8a971cdd50cad2c81ccec34db0170b726ad87bf"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.581794 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://6e4b86e710d4c726b7493f91a8a971cdd50cad2c81ccec34db0170b726ad87bf" gracePeriod=30 Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.654780 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.683935 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.703420 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.711484 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.733591 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.748417 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.952241 4911 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 14:07:20 crc kubenswrapper[4911]: I0310 14:07:20.982085 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.007095 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.082993 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.097458 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.098488 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.149968 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.198106 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.323994 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.334457 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.336672 4911 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.339250 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.460338 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.558087 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.576952 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.587526 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.683673 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.744941 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.877165 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 14:07:21 crc kubenswrapper[4911]: I0310 14:07:21.959289 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.159152 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 14:07:22 crc kubenswrapper[4911]: 
I0310 14:07:22.160762 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.169894 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.177760 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.242448 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.250522 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.258881 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.309779 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.324199 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.421082 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.625247 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.630486 4911 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.652839 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.705484 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.731904 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.731954 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.745899 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.755843 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.779470 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.821920 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.847118 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.951101 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 14:07:22 crc kubenswrapper[4911]: I0310 14:07:22.982940 
4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.003715 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.029450 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.046823 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.048584 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.099090 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.128642 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.312631 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.426319 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.471551 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.536898 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 
14:07:23.594535 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.628639 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.664444 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.669398 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.676210 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.815084 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.818669 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.838322 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.893635 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 14:07:23 crc kubenswrapper[4911]: I0310 14:07:23.943553 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.100224 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.100551 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.158257 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.171702 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.190956 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.277060 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.306922 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.460390 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.488092 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.506564 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.534192 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.557057 4911 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.582355 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.625597 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.635176 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.679430 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.819459 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.823220 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 14:07:24 crc kubenswrapper[4911]: I0310 14:07:24.972081 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.078642 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.089632 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.149259 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 14:07:25 crc 
kubenswrapper[4911]: I0310 14:07:25.176759 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.258142 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.259591 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.276027 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.306529 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.383504 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.438846 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.490958 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.517921 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.522629 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.548767 4911 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.551453 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.630861 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.755275 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 14:07:25 crc kubenswrapper[4911]: I0310 14:07:25.955926 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.105790 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.131381 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.263661 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.274416 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.282622 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.448638 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" 
Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.456583 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.497430 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.723371 4911 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.726090 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.726054962 podStartE2EDuration="41.726054962s" podCreationTimestamp="2026-03-10 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:07:04.339575278 +0000 UTC m=+328.903095195" watchObservedRunningTime="2026-03-10 14:07:26.726054962 +0000 UTC m=+351.289574919" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.732114 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7tfvh","openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/certified-operators-cvtlp"] Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.732219 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j","openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 14:07:26 crc kubenswrapper[4911]: E0310 14:07:26.732521 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" containerName="installer" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.732550 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" containerName="installer" Mar 10 14:07:26 
crc kubenswrapper[4911]: E0310 14:07:26.732582 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2160b26f-7876-42d1-8d78-22f6f57cb08e" containerName="oauth-openshift" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.732595 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="2160b26f-7876-42d1-8d78-22f6f57cb08e" containerName="oauth-openshift" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.732944 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="2160b26f-7876-42d1-8d78-22f6f57cb08e" containerName="oauth-openshift" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.732966 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="7365bf68-b4d4-4df4-8e73-6c2336e58792" containerName="installer" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.733013 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.733053 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c243d166-4ad1-46ec-ac74-f5f55b7e0fb3" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.733623 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.741142 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.741169 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.741196 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.741286 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.741377 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.741433 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.741495 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.741540 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.742336 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.742412 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-template-error\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.742463 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.742500 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.742537 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.742585 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-template-login\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.742660 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.742754 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-audit-policies\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.742800 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.742839 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qksfb\" (UniqueName: \"kubernetes.io/projected/ab315b4d-d7ab-497c-969e-169147910d15-kube-api-access-qksfb\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.743154 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.743368 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.743549 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ab315b4d-d7ab-497c-969e-169147910d15-audit-dir\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.743888 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-session\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.743970 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.744017 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.744133 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.747591 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.752040 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.759145 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.765475 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.791072 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.845840 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " 
pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.846271 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-template-login\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.846306 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.846335 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-audit-policies\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.847557 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.847668 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qksfb\" (UniqueName: \"kubernetes.io/projected/ab315b4d-d7ab-497c-969e-169147910d15-kube-api-access-qksfb\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.847923 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.848005 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ab315b4d-d7ab-497c-969e-169147910d15-audit-dir\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.848070 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-session\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.848164 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " 
pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.848217 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.848332 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.848383 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-audit-policies\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.848523 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.848559 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-template-error\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.848220 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ab315b4d-d7ab-497c-969e-169147910d15-audit-dir\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.849249 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.849285 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.849848 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc 
kubenswrapper[4911]: I0310 14:07:26.852974 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.857502 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-template-login\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.858254 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.858782 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-session\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.862373 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.866424 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.866408236 podStartE2EDuration="22.866408236s" podCreationTimestamp="2026-03-10 14:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:07:26.835556243 +0000 UTC m=+351.399076180" watchObservedRunningTime="2026-03-10 14:07:26.866408236 +0000 UTC m=+351.429928153" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.867471 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qksfb\" (UniqueName: \"kubernetes.io/projected/ab315b4d-d7ab-497c-969e-169147910d15-kube-api-access-qksfb\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.869042 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.869289 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.871212 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ab315b4d-d7ab-497c-969e-169147910d15-v4-0-config-user-template-error\") pod \"oauth-openshift-79d66fd6fb-fsp5j\" (UID: \"ab315b4d-d7ab-497c-969e-169147910d15\") " pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:26 crc kubenswrapper[4911]: I0310 14:07:26.937197 4911 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.038673 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.054311 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.058888 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.210530 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.245187 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.245357 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.316021 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.333673 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.348176 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.470663 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.595050 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.703011 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.820817 4911 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 14:07:27 crc 
kubenswrapper[4911]: I0310 14:07:27.891927 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j"] Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.909025 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.959758 4911 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 14:07:27 crc kubenswrapper[4911]: I0310 14:07:27.996522 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.201781 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2160b26f-7876-42d1-8d78-22f6f57cb08e" path="/var/lib/kubelet/pods/2160b26f-7876-42d1-8d78-22f6f57cb08e/volumes" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.203019 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b986951d-80c8-4f06-a12b-9dd8047a7bf5" path="/var/lib/kubelet/pods/b986951d-80c8-4f06-a12b-9dd8047a7bf5/volumes" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.226748 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.345117 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.472616 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" event={"ID":"ab315b4d-d7ab-497c-969e-169147910d15","Type":"ContainerStarted","Data":"d76d629c4df9f1cfb201e19e3ae236583612754b8197b369afd03bae7cc26100"} Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.472670 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" event={"ID":"ab315b4d-d7ab-497c-969e-169147910d15","Type":"ContainerStarted","Data":"339960481e0455df0da7bb6932a50b71e82266af321c66672d92866658dfe5d4"} Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.507174 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" podStartSLOduration=47.507149436 podStartE2EDuration="47.507149436s" podCreationTimestamp="2026-03-10 14:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:07:28.503507311 +0000 UTC m=+353.067027238" watchObservedRunningTime="2026-03-10 14:07:28.507149436 +0000 UTC m=+353.070669353" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.566004 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.586901 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.593547 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.613228 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.695681 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.770210 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 14:07:28 crc 
kubenswrapper[4911]: I0310 14:07:28.773023 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.780884 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.890601 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.989913 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 14:07:28 crc kubenswrapper[4911]: I0310 14:07:28.992827 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.081242 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.090067 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.233065 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.288202 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.480721 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.487286 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-79d66fd6fb-fsp5j" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.617976 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.619417 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.672173 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.943239 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 14:07:29 crc kubenswrapper[4911]: I0310 14:07:29.943913 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 14:07:30 crc kubenswrapper[4911]: I0310 14:07:30.114774 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 14:07:30 crc kubenswrapper[4911]: I0310 14:07:30.115546 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 14:07:30 crc kubenswrapper[4911]: I0310 14:07:30.116256 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 14:07:30 crc kubenswrapper[4911]: I0310 14:07:30.368237 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 14:07:30 crc kubenswrapper[4911]: I0310 14:07:30.430683 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 14:07:30 crc kubenswrapper[4911]: I0310 14:07:30.540592 4911 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 14:07:30 crc kubenswrapper[4911]: I0310 14:07:30.603121 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 14:07:30 crc kubenswrapper[4911]: I0310 14:07:30.625762 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 14:07:30 crc kubenswrapper[4911]: I0310 14:07:30.773977 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 14:07:30 crc kubenswrapper[4911]: I0310 14:07:30.919283 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 14:07:31 crc kubenswrapper[4911]: I0310 14:07:31.004297 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 14:07:31 crc kubenswrapper[4911]: I0310 14:07:31.041038 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 14:07:31 crc kubenswrapper[4911]: I0310 14:07:31.288998 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 14:07:31 crc kubenswrapper[4911]: I0310 14:07:31.382877 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 14:07:31 crc kubenswrapper[4911]: I0310 14:07:31.410539 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 14:07:31 crc kubenswrapper[4911]: I0310 14:07:31.427630 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 14:07:31 crc kubenswrapper[4911]: I0310 
14:07:31.432595 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 14:07:31 crc kubenswrapper[4911]: I0310 14:07:31.916566 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 14:07:32 crc kubenswrapper[4911]: I0310 14:07:32.178527 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 14:07:32 crc kubenswrapper[4911]: I0310 14:07:32.477553 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 14:07:32 crc kubenswrapper[4911]: I0310 14:07:32.492512 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 14:07:32 crc kubenswrapper[4911]: I0310 14:07:32.496478 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 14:07:38 crc kubenswrapper[4911]: I0310 14:07:38.048725 4911 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 14:07:38 crc kubenswrapper[4911]: I0310 14:07:38.049548 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://224c49a96b06277acf0c842176bcc63806b7e8f293c981d4a0d38dd2bd040ae3" gracePeriod=5 Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.583402 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.584181 4911 generic.go:334] "Generic (PLEG): 
container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="224c49a96b06277acf0c842176bcc63806b7e8f293c981d4a0d38dd2bd040ae3" exitCode=137 Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.641580 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.641676 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.723717 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.723809 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.723915 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.723958 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.724036 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.724133 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.724087 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.724337 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.724169 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.725564 4911 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.725660 4911 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.725743 4911 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.725815 4911 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.731359 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:07:43 crc kubenswrapper[4911]: I0310 14:07:43.827324 4911 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 14:07:44 crc kubenswrapper[4911]: I0310 14:07:44.202371 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 10 14:07:44 crc kubenswrapper[4911]: I0310 14:07:44.203918 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 10 14:07:44 crc kubenswrapper[4911]: I0310 14:07:44.235924 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 14:07:44 crc kubenswrapper[4911]: I0310 14:07:44.235974 4911 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="10aaf8e8-bb20-4786-9a03-9cb4aa86b713" Mar 10 14:07:44 crc kubenswrapper[4911]: I0310 14:07:44.235997 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 14:07:44 crc kubenswrapper[4911]: I0310 14:07:44.236008 4911 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="10aaf8e8-bb20-4786-9a03-9cb4aa86b713" Mar 10 14:07:44 crc kubenswrapper[4911]: I0310 14:07:44.593391 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 14:07:44 crc kubenswrapper[4911]: I0310 14:07:44.593621 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 14:07:44 crc kubenswrapper[4911]: I0310 14:07:44.593716 4911 scope.go:117] "RemoveContainer" containerID="224c49a96b06277acf0c842176bcc63806b7e8f293c981d4a0d38dd2bd040ae3" Mar 10 14:07:50 crc kubenswrapper[4911]: I0310 14:07:50.632262 4911 generic.go:334] "Generic (PLEG): container finished" podID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerID="cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b" exitCode=0 Mar 10 14:07:50 crc kubenswrapper[4911]: I0310 14:07:50.632349 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" event={"ID":"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02","Type":"ContainerDied","Data":"cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b"} Mar 10 14:07:50 crc kubenswrapper[4911]: I0310 14:07:50.633307 4911 scope.go:117] "RemoveContainer" containerID="cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b" Mar 10 14:07:51 crc kubenswrapper[4911]: I0310 14:07:51.640437 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 10 14:07:51 crc kubenswrapper[4911]: I0310 14:07:51.642464 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 14:07:51 crc kubenswrapper[4911]: I0310 14:07:51.643527 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 14:07:51 crc kubenswrapper[4911]: I0310 14:07:51.643604 4911 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="6e4b86e710d4c726b7493f91a8a971cdd50cad2c81ccec34db0170b726ad87bf" exitCode=137 Mar 10 14:07:51 crc kubenswrapper[4911]: I0310 14:07:51.643671 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6e4b86e710d4c726b7493f91a8a971cdd50cad2c81ccec34db0170b726ad87bf"} Mar 10 14:07:51 crc kubenswrapper[4911]: I0310 14:07:51.643743 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"989521b42d21a6c4756b5bea0216923e6bd1416014bad51fceccc7bb7daac165"} Mar 10 14:07:51 crc kubenswrapper[4911]: I0310 14:07:51.643771 4911 scope.go:117] "RemoveContainer" containerID="711b3fd3eea541ea7887f2ffe296158f7cafe474a5b469943239d1c64d044be9" Mar 10 14:07:51 crc kubenswrapper[4911]: I0310 14:07:51.651224 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" event={"ID":"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02","Type":"ContainerStarted","Data":"a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e"} Mar 10 14:07:51 crc kubenswrapper[4911]: I0310 14:07:51.651944 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:07:51 crc kubenswrapper[4911]: I0310 14:07:51.656152 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:07:52 crc kubenswrapper[4911]: I0310 14:07:52.660586 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 10 14:07:52 crc kubenswrapper[4911]: I0310 14:07:52.662296 4911 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 14:08:00 crc kubenswrapper[4911]: I0310 14:08:00.578979 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:08:00 crc kubenswrapper[4911]: I0310 14:08:00.583778 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:08:00 crc kubenswrapper[4911]: I0310 14:08:00.709969 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:08:00 crc kubenswrapper[4911]: I0310 14:08:00.715765 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 14:08:10 crc kubenswrapper[4911]: I0310 14:08:10.732261 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9pm6m"] Mar 10 14:08:10 crc kubenswrapper[4911]: I0310 14:08:10.732805 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9pm6m" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerName="registry-server" containerID="cri-o://7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2" gracePeriod=2 Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.051169 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552528-m6klh"] Mar 10 14:08:11 crc kubenswrapper[4911]: E0310 14:08:11.051425 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.051438 4911 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.051544 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.052011 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552528-m6klh" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.054090 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.054206 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.054594 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.061864 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552528-m6klh"] Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.124653 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"] Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.125114 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" podUID="44c44298-3652-4f8e-af18-2e038a8e5ecd" containerName="route-controller-manager" containerID="cri-o://4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc" gracePeriod=30 Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.140130 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"] Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.140387 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" podUID="8bff43b0-a953-404b-a957-2afab9373552" containerName="controller-manager" containerID="cri-o://5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1" gracePeriod=30 Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.152724 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgh5k\" (UniqueName: \"kubernetes.io/projected/bbc69578-9347-4984-af7d-e05aa9abd29d-kube-api-access-pgh5k\") pod \"auto-csr-approver-29552528-m6klh\" (UID: \"bbc69578-9347-4984-af7d-e05aa9abd29d\") " pod="openshift-infra/auto-csr-approver-29552528-m6klh" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.254060 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgh5k\" (UniqueName: \"kubernetes.io/projected/bbc69578-9347-4984-af7d-e05aa9abd29d-kube-api-access-pgh5k\") pod \"auto-csr-approver-29552528-m6klh\" (UID: \"bbc69578-9347-4984-af7d-e05aa9abd29d\") " pod="openshift-infra/auto-csr-approver-29552528-m6klh" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.288708 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgh5k\" (UniqueName: \"kubernetes.io/projected/bbc69578-9347-4984-af7d-e05aa9abd29d-kube-api-access-pgh5k\") pod \"auto-csr-approver-29552528-m6klh\" (UID: \"bbc69578-9347-4984-af7d-e05aa9abd29d\") " pod="openshift-infra/auto-csr-approver-29552528-m6klh" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.355109 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9pm6m" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.374665 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552528-m6klh" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.456581 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-utilities\") pod \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.456644 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ml8k\" (UniqueName: \"kubernetes.io/projected/f11c754a-10a0-46ac-b171-5ccfecebdb7c-kube-api-access-8ml8k\") pod \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.456664 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-catalog-content\") pod \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\" (UID: \"f11c754a-10a0-46ac-b171-5ccfecebdb7c\") " Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.457898 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-utilities" (OuterVolumeSpecName: "utilities") pod "f11c754a-10a0-46ac-b171-5ccfecebdb7c" (UID: "f11c754a-10a0-46ac-b171-5ccfecebdb7c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.471486 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11c754a-10a0-46ac-b171-5ccfecebdb7c-kube-api-access-8ml8k" (OuterVolumeSpecName: "kube-api-access-8ml8k") pod "f11c754a-10a0-46ac-b171-5ccfecebdb7c" (UID: "f11c754a-10a0-46ac-b171-5ccfecebdb7c"). InnerVolumeSpecName "kube-api-access-8ml8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.559133 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ml8k\" (UniqueName: \"kubernetes.io/projected/f11c754a-10a0-46ac-b171-5ccfecebdb7c-kube-api-access-8ml8k\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.559433 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.640048 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f11c754a-10a0-46ac-b171-5ccfecebdb7c" (UID: "f11c754a-10a0-46ac-b171-5ccfecebdb7c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.660883 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11c754a-10a0-46ac-b171-5ccfecebdb7c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.674518 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.761904 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-config\") pod \"8bff43b0-a953-404b-a957-2afab9373552\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.761977 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6wkp\" (UniqueName: \"kubernetes.io/projected/8bff43b0-a953-404b-a957-2afab9373552-kube-api-access-x6wkp\") pod \"8bff43b0-a953-404b-a957-2afab9373552\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.762028 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-proxy-ca-bundles\") pod \"8bff43b0-a953-404b-a957-2afab9373552\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.762054 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bff43b0-a953-404b-a957-2afab9373552-serving-cert\") pod \"8bff43b0-a953-404b-a957-2afab9373552\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.762094 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-client-ca\") pod \"8bff43b0-a953-404b-a957-2afab9373552\" (UID: \"8bff43b0-a953-404b-a957-2afab9373552\") " Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.763052 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-client-ca" (OuterVolumeSpecName: "client-ca") pod "8bff43b0-a953-404b-a957-2afab9373552" (UID: "8bff43b0-a953-404b-a957-2afab9373552"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.763071 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8bff43b0-a953-404b-a957-2afab9373552" (UID: "8bff43b0-a953-404b-a957-2afab9373552"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.763291 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-config" (OuterVolumeSpecName: "config") pod "8bff43b0-a953-404b-a957-2afab9373552" (UID: "8bff43b0-a953-404b-a957-2afab9373552"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.767079 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bff43b0-a953-404b-a957-2afab9373552-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8bff43b0-a953-404b-a957-2afab9373552" (UID: "8bff43b0-a953-404b-a957-2afab9373552"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.767581 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bff43b0-a953-404b-a957-2afab9373552-kube-api-access-x6wkp" (OuterVolumeSpecName: "kube-api-access-x6wkp") pod "8bff43b0-a953-404b-a957-2afab9373552" (UID: "8bff43b0-a953-404b-a957-2afab9373552"). InnerVolumeSpecName "kube-api-access-x6wkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.769880 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.779066 4911 generic.go:334] "Generic (PLEG): container finished" podID="44c44298-3652-4f8e-af18-2e038a8e5ecd" containerID="4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc" exitCode=0 Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.779162 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" event={"ID":"44c44298-3652-4f8e-af18-2e038a8e5ecd","Type":"ContainerDied","Data":"4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc"} Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.779199 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" event={"ID":"44c44298-3652-4f8e-af18-2e038a8e5ecd","Type":"ContainerDied","Data":"ea8e75bbd11215fee96c2a0c2d26b1e8a26ecd881760a313a1163c9408f51753"} Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.779220 4911 scope.go:117] "RemoveContainer" containerID="4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.779351 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.783113 4911 generic.go:334] "Generic (PLEG): container finished" podID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerID="7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2" exitCode=0 Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.783216 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9pm6m" Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.783240 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pm6m" event={"ID":"f11c754a-10a0-46ac-b171-5ccfecebdb7c","Type":"ContainerDied","Data":"7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2"} Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.783284 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pm6m" event={"ID":"f11c754a-10a0-46ac-b171-5ccfecebdb7c","Type":"ContainerDied","Data":"ba29199479e057e8f57a4a51909765322ae20b00593013ae73bd44b96e0423e8"} Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.785194 4911 generic.go:334] "Generic (PLEG): container finished" podID="8bff43b0-a953-404b-a957-2afab9373552" containerID="5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1" exitCode=0 Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.785242 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" event={"ID":"8bff43b0-a953-404b-a957-2afab9373552","Type":"ContainerDied","Data":"5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1"} Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.785267 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8" 
event={"ID":"8bff43b0-a953-404b-a957-2afab9373552","Type":"ContainerDied","Data":"3b35a06e4ed3eb634246aca3e5e29efafcba293be0989a4619b2cb4cb618702b"}
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.785311 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.814225 4911 scope.go:117] "RemoveContainer" containerID="4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc"
Mar 10 14:08:11 crc kubenswrapper[4911]: E0310 14:08:11.818468 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc\": container with ID starting with 4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc not found: ID does not exist" containerID="4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.818523 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc"} err="failed to get container status \"4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc\": rpc error: code = NotFound desc = could not find container \"4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc\": container with ID starting with 4a92847375c640e192430b5d0ae2b70162c0eed792ad7a338a7e6a82a95b8dbc not found: ID does not exist"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.818581 4911 scope.go:117] "RemoveContainer" containerID="7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.843892 4911 scope.go:117] "RemoveContainer" containerID="bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.861132 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9pm6m"]
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.870886 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9pm6m"]
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.872535 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-config\") pod \"44c44298-3652-4f8e-af18-2e038a8e5ecd\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") "
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.872639 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c44298-3652-4f8e-af18-2e038a8e5ecd-serving-cert\") pod \"44c44298-3652-4f8e-af18-2e038a8e5ecd\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") "
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.872750 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-client-ca\") pod \"44c44298-3652-4f8e-af18-2e038a8e5ecd\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") "
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.872786 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdhwk\" (UniqueName: \"kubernetes.io/projected/44c44298-3652-4f8e-af18-2e038a8e5ecd-kube-api-access-fdhwk\") pod \"44c44298-3652-4f8e-af18-2e038a8e5ecd\" (UID: \"44c44298-3652-4f8e-af18-2e038a8e5ecd\") "
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.873187 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-config\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.873205 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6wkp\" (UniqueName: \"kubernetes.io/projected/8bff43b0-a953-404b-a957-2afab9373552-kube-api-access-x6wkp\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.873217 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.873226 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bff43b0-a953-404b-a957-2afab9373552-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.873235 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bff43b0-a953-404b-a957-2afab9373552-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.874051 4911 scope.go:117] "RemoveContainer" containerID="c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.874048 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-config" (OuterVolumeSpecName: "config") pod "44c44298-3652-4f8e-af18-2e038a8e5ecd" (UID: "44c44298-3652-4f8e-af18-2e038a8e5ecd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.874480 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-client-ca" (OuterVolumeSpecName: "client-ca") pod "44c44298-3652-4f8e-af18-2e038a8e5ecd" (UID: "44c44298-3652-4f8e-af18-2e038a8e5ecd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.884928 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c44298-3652-4f8e-af18-2e038a8e5ecd-kube-api-access-fdhwk" (OuterVolumeSpecName: "kube-api-access-fdhwk") pod "44c44298-3652-4f8e-af18-2e038a8e5ecd" (UID: "44c44298-3652-4f8e-af18-2e038a8e5ecd"). InnerVolumeSpecName "kube-api-access-fdhwk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.885922 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c44298-3652-4f8e-af18-2e038a8e5ecd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "44c44298-3652-4f8e-af18-2e038a8e5ecd" (UID: "44c44298-3652-4f8e-af18-2e038a8e5ecd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.891818 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"]
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.895331 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-759b87b8f8-q4nj8"]
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.929774 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552528-m6klh"]
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.947747 4911 scope.go:117] "RemoveContainer" containerID="7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2"
Mar 10 14:08:11 crc kubenswrapper[4911]: E0310 14:08:11.948540 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2\": container with ID starting with 7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2 not found: ID does not exist" containerID="7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.948595 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2"} err="failed to get container status \"7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2\": rpc error: code = NotFound desc = could not find container \"7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2\": container with ID starting with 7c83a33d80e30ea0852b2dc96adf31e006ae024598d57cb77131f9948a6f40f2 not found: ID does not exist"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.948628 4911 scope.go:117] "RemoveContainer" containerID="bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa"
Mar 10 14:08:11 crc kubenswrapper[4911]: E0310 14:08:11.949383 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa\": container with ID starting with bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa not found: ID does not exist" containerID="bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.949406 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa"} err="failed to get container status \"bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa\": rpc error: code = NotFound desc = could not find container \"bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa\": container with ID starting with bff984a1f671a5639a8ce3c3ddcba8b26d248f1a8bbd0a9b7624277aabe2fcfa not found: ID does not exist"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.949419 4911 scope.go:117] "RemoveContainer" containerID="c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5"
Mar 10 14:08:11 crc kubenswrapper[4911]: E0310 14:08:11.950184 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5\": container with ID starting with c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5 not found: ID does not exist" containerID="c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.950208 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5"} err="failed to get container status \"c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5\": rpc error: code = NotFound desc = could not find container \"c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5\": container with ID starting with c0ff428d4d9a33a0a236cf5da43b7e33075248048728b3f19b587f1a8d940ec5 not found: ID does not exist"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.950224 4911 scope.go:117] "RemoveContainer" containerID="5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.975122 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-config\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.975600 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c44298-3652-4f8e-af18-2e038a8e5ecd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.975612 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c44298-3652-4f8e-af18-2e038a8e5ecd-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.975625 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdhwk\" (UniqueName: \"kubernetes.io/projected/44c44298-3652-4f8e-af18-2e038a8e5ecd-kube-api-access-fdhwk\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.984157 4911 scope.go:117] "RemoveContainer" containerID="5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1"
Mar 10 14:08:11 crc kubenswrapper[4911]: E0310 14:08:11.984801 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1\": container with ID starting with 5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1 not found: ID does not exist" containerID="5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1"
Mar 10 14:08:11 crc kubenswrapper[4911]: I0310 14:08:11.984852 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1"} err="failed to get container status \"5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1\": rpc error: code = NotFound desc = could not find container \"5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1\": container with ID starting with 5397cc88cac67aae2792224357bcb17cdfd9d1bca5f01132e2a32d017cd9dff1 not found: ID does not exist"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.114192 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"]
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.119816 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-567c4dddc8-qqjzq"]
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.202352 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c44298-3652-4f8e-af18-2e038a8e5ecd" path="/var/lib/kubelet/pods/44c44298-3652-4f8e-af18-2e038a8e5ecd/volumes"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.203042 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bff43b0-a953-404b-a957-2afab9373552" path="/var/lib/kubelet/pods/8bff43b0-a953-404b-a957-2afab9373552/volumes"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.203593 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" path="/var/lib/kubelet/pods/f11c754a-10a0-46ac-b171-5ccfecebdb7c/volumes"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.804448 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552528-m6klh" event={"ID":"bbc69578-9347-4984-af7d-e05aa9abd29d","Type":"ContainerStarted","Data":"b14abee5ad255bac511a19b8aa3a8bf56980c94b106171824610ceb304dc7148"}
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.817572 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"]
Mar 10 14:08:12 crc kubenswrapper[4911]: E0310 14:08:12.818099 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerName="extract-content"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.818126 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerName="extract-content"
Mar 10 14:08:12 crc kubenswrapper[4911]: E0310 14:08:12.818148 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerName="registry-server"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.818162 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerName="registry-server"
Mar 10 14:08:12 crc kubenswrapper[4911]: E0310 14:08:12.818189 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c44298-3652-4f8e-af18-2e038a8e5ecd" containerName="route-controller-manager"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.818204 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c44298-3652-4f8e-af18-2e038a8e5ecd" containerName="route-controller-manager"
Mar 10 14:08:12 crc kubenswrapper[4911]: E0310 14:08:12.818233 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bff43b0-a953-404b-a957-2afab9373552" containerName="controller-manager"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.818247 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bff43b0-a953-404b-a957-2afab9373552" containerName="controller-manager"
Mar 10 14:08:12 crc kubenswrapper[4911]: E0310 14:08:12.818268 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerName="extract-utilities"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.818281 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerName="extract-utilities"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.818479 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bff43b0-a953-404b-a957-2afab9373552" containerName="controller-manager"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.818512 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c44298-3652-4f8e-af18-2e038a8e5ecd" containerName="route-controller-manager"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.818541 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11c754a-10a0-46ac-b171-5ccfecebdb7c" containerName="registry-server"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.819355 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.826876 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7448946b4c-vn78p"]
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.827910 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.833615 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"]
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.871873 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.872183 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.872407 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.872469 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.872489 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.872714 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.872867 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.873024 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.873194 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.873400 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.873518 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.874300 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.877885 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.878017 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7448946b4c-vn78p"]
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.889006 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb92791b-a628-41ac-ad29-7fb59c0c35f8-serving-cert\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.889057 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-client-ca\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.889152 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-config\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.889182 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9rcv\" (UniqueName: \"kubernetes.io/projected/bb92791b-a628-41ac-ad29-7fb59c0c35f8-kube-api-access-z9rcv\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.889225 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afbcea8f-10ef-4ba5-a08e-03286fe18a29-client-ca\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.889263 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzhxf\" (UniqueName: \"kubernetes.io/projected/afbcea8f-10ef-4ba5-a08e-03286fe18a29-kube-api-access-nzhxf\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.889301 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbcea8f-10ef-4ba5-a08e-03286fe18a29-config\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.889324 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afbcea8f-10ef-4ba5-a08e-03286fe18a29-serving-cert\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.889354 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afbcea8f-10ef-4ba5-a08e-03286fe18a29-proxy-ca-bundles\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.990278 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afbcea8f-10ef-4ba5-a08e-03286fe18a29-proxy-ca-bundles\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.990384 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb92791b-a628-41ac-ad29-7fb59c0c35f8-serving-cert\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.990443 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-client-ca\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.990528 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-config\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.990561 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9rcv\" (UniqueName: \"kubernetes.io/projected/bb92791b-a628-41ac-ad29-7fb59c0c35f8-kube-api-access-z9rcv\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.990607 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afbcea8f-10ef-4ba5-a08e-03286fe18a29-client-ca\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.990678 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzhxf\" (UniqueName: \"kubernetes.io/projected/afbcea8f-10ef-4ba5-a08e-03286fe18a29-kube-api-access-nzhxf\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.990755 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbcea8f-10ef-4ba5-a08e-03286fe18a29-config\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.990794 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afbcea8f-10ef-4ba5-a08e-03286fe18a29-serving-cert\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.992114 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afbcea8f-10ef-4ba5-a08e-03286fe18a29-client-ca\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.992328 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-config\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.992485 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-client-ca\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.992868 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afbcea8f-10ef-4ba5-a08e-03286fe18a29-proxy-ca-bundles\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:12 crc kubenswrapper[4911]: I0310 14:08:12.992991 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbcea8f-10ef-4ba5-a08e-03286fe18a29-config\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.006846 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afbcea8f-10ef-4ba5-a08e-03286fe18a29-serving-cert\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.007244 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb92791b-a628-41ac-ad29-7fb59c0c35f8-serving-cert\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.010114 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzhxf\" (UniqueName: \"kubernetes.io/projected/afbcea8f-10ef-4ba5-a08e-03286fe18a29-kube-api-access-nzhxf\") pod \"controller-manager-7448946b4c-vn78p\" (UID: \"afbcea8f-10ef-4ba5-a08e-03286fe18a29\") " pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.013313 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9rcv\" (UniqueName: \"kubernetes.io/projected/bb92791b-a628-41ac-ad29-7fb59c0c35f8-kube-api-access-z9rcv\") pod \"route-controller-manager-55985dff9-rwv69\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.199251 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.208857 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.547455 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7448946b4c-vn78p"]
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.695258 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"]
Mar 10 14:08:13 crc kubenswrapper[4911]: W0310 14:08:13.700819 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb92791b_a628_41ac_ad29_7fb59c0c35f8.slice/crio-11a549637352921de486d556e6f3864068994b78b9c81392a345bcb8768b6651 WatchSource:0}: Error finding container 11a549637352921de486d556e6f3864068994b78b9c81392a345bcb8768b6651: Status 404 returned error can't find the container with id 11a549637352921de486d556e6f3864068994b78b9c81392a345bcb8768b6651
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.812537 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p" event={"ID":"afbcea8f-10ef-4ba5-a08e-03286fe18a29","Type":"ContainerStarted","Data":"b332c5f7b5a6bcaaf96414fa44b01ab8847ec43b43058516cb640e1f31f06ad2"}
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.812985 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p" event={"ID":"afbcea8f-10ef-4ba5-a08e-03286fe18a29","Type":"ContainerStarted","Data":"81e09794fd820cc3d82781f50ba516fa7772062b75e9e9fb3654246aecd66a5f"}
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.813142 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.814397 4911 generic.go:334] "Generic (PLEG): container finished" podID="bbc69578-9347-4984-af7d-e05aa9abd29d" containerID="f8f6544ff8e5585502ae5f1078798e5b9927b2ad0e12ba5072382cf891245b7e" exitCode=0
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.814543 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552528-m6klh" event={"ID":"bbc69578-9347-4984-af7d-e05aa9abd29d","Type":"ContainerDied","Data":"f8f6544ff8e5585502ae5f1078798e5b9927b2ad0e12ba5072382cf891245b7e"}
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.816312 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69" event={"ID":"bb92791b-a628-41ac-ad29-7fb59c0c35f8","Type":"ContainerStarted","Data":"11a549637352921de486d556e6f3864068994b78b9c81392a345bcb8768b6651"}
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.820635 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p"
Mar 10 14:08:13 crc kubenswrapper[4911]: I0310 14:08:13.834890 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7448946b4c-vn78p" podStartSLOduration=2.834868653 podStartE2EDuration="2.834868653s" podCreationTimestamp="2026-03-10 14:08:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:08:13.830547989 +0000 UTC m=+398.394067906" watchObservedRunningTime="2026-03-10 14:08:13.834868653 +0000 UTC m=+398.398388570"
Mar 10 14:08:14 crc kubenswrapper[4911]: I0310 14:08:14.824689 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69" event={"ID":"bb92791b-a628-41ac-ad29-7fb59c0c35f8","Type":"ContainerStarted","Data":"2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a"}
Mar 10 14:08:14 crc kubenswrapper[4911]: I0310 14:08:14.847673 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69" podStartSLOduration=3.847648197 podStartE2EDuration="3.847648197s" podCreationTimestamp="2026-03-10 14:08:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:08:14.84305112 +0000 UTC m=+399.406571037" watchObservedRunningTime="2026-03-10 14:08:14.847648197 +0000 UTC m=+399.411168114"
Mar 10 14:08:15 crc kubenswrapper[4911]: I0310 14:08:15.131415 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552528-m6klh"
Mar 10 14:08:15 crc kubenswrapper[4911]: I0310 14:08:15.226922 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgh5k\" (UniqueName: \"kubernetes.io/projected/bbc69578-9347-4984-af7d-e05aa9abd29d-kube-api-access-pgh5k\") pod \"bbc69578-9347-4984-af7d-e05aa9abd29d\" (UID: \"bbc69578-9347-4984-af7d-e05aa9abd29d\") "
Mar 10 14:08:15 crc kubenswrapper[4911]: I0310 14:08:15.232689 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbc69578-9347-4984-af7d-e05aa9abd29d-kube-api-access-pgh5k" (OuterVolumeSpecName: "kube-api-access-pgh5k") pod "bbc69578-9347-4984-af7d-e05aa9abd29d" (UID: "bbc69578-9347-4984-af7d-e05aa9abd29d"). InnerVolumeSpecName "kube-api-access-pgh5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:08:15 crc kubenswrapper[4911]: I0310 14:08:15.328336 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgh5k\" (UniqueName: \"kubernetes.io/projected/bbc69578-9347-4984-af7d-e05aa9abd29d-kube-api-access-pgh5k\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:15 crc kubenswrapper[4911]: I0310 14:08:15.833431 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552528-m6klh" event={"ID":"bbc69578-9347-4984-af7d-e05aa9abd29d","Type":"ContainerDied","Data":"b14abee5ad255bac511a19b8aa3a8bf56980c94b106171824610ceb304dc7148"}
Mar 10 14:08:15 crc kubenswrapper[4911]: I0310 14:08:15.833502 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b14abee5ad255bac511a19b8aa3a8bf56980c94b106171824610ceb304dc7148"
Mar 10 14:08:15 crc kubenswrapper[4911]: I0310 14:08:15.834641 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552528-m6klh"
Mar 10 14:08:15 crc kubenswrapper[4911]: I0310 14:08:15.835027 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:15 crc kubenswrapper[4911]: I0310 14:08:15.840164 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"
Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.314282 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wwsm2"]
Mar 10 14:08:29 crc kubenswrapper[4911]: E0310 14:08:29.316233 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc69578-9347-4984-af7d-e05aa9abd29d" containerName="oc"
Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.316321 4911 state_mem.go:107] "Deleted CPUSet assignment"
podUID="bbc69578-9347-4984-af7d-e05aa9abd29d" containerName="oc" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.316490 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbc69578-9347-4984-af7d-e05aa9abd29d" containerName="oc" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.317059 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.337795 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wwsm2"] Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.442582 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.442644 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-bound-sa-token\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.442706 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgc4c\" (UniqueName: \"kubernetes.io/projected/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-kube-api-access-zgc4c\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 
14:08:29.442769 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-trusted-ca\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.442812 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.443322 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-registry-tls\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.443650 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.443930 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-registry-certificates\") pod 
\"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.478181 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.545344 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-bound-sa-token\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.545445 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgc4c\" (UniqueName: \"kubernetes.io/projected/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-kube-api-access-zgc4c\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.545521 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-trusted-ca\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.545548 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-registry-tls\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.545591 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.545651 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-registry-certificates\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.545696 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.546884 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.548110 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-registry-certificates\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.548437 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-trusted-ca\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.553114 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-registry-tls\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.561984 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.567108 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-bound-sa-token\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc 
kubenswrapper[4911]: I0310 14:08:29.572511 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgc4c\" (UniqueName: \"kubernetes.io/projected/3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e-kube-api-access-zgc4c\") pod \"image-registry-66df7c8f76-wwsm2\" (UID: \"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e\") " pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:29 crc kubenswrapper[4911]: I0310 14:08:29.637254 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:30 crc kubenswrapper[4911]: I0310 14:08:30.054711 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wwsm2"] Mar 10 14:08:30 crc kubenswrapper[4911]: I0310 14:08:30.938809 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" event={"ID":"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e","Type":"ContainerStarted","Data":"bf5fc1cc7f7ea9fa89b6d8728f8111e273fe879bc652fbec4637da916923c876"} Mar 10 14:08:30 crc kubenswrapper[4911]: I0310 14:08:30.939474 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" event={"ID":"3a625b0e-8cf1-4f35-9b77-d9d2aa81d28e","Type":"ContainerStarted","Data":"662bf47416d3b6e7837ff7d6f8db24d0947def61b0f6b0711a9aa4152ec090e4"} Mar 10 14:08:30 crc kubenswrapper[4911]: I0310 14:08:30.939499 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:30 crc kubenswrapper[4911]: I0310 14:08:30.960846 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" podStartSLOduration=1.9608270540000001 podStartE2EDuration="1.960827054s" podCreationTimestamp="2026-03-10 14:08:29 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:08:30.95952825 +0000 UTC m=+415.523048197" watchObservedRunningTime="2026-03-10 14:08:30.960827054 +0000 UTC m=+415.524346971" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.020304 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ttx9c"] Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.024070 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ttx9c" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerName="registry-server" containerID="cri-o://e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19" gracePeriod=30 Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.025779 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w2clr"] Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.026049 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w2clr" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerName="registry-server" containerID="cri-o://cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c" gracePeriod=30 Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.034835 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rbltz"] Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.035194 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" podUID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerName="marketplace-operator" containerID="cri-o://a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e" gracePeriod=30 Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.046551 4911 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-vrd7f"] Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.046900 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vrd7f" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerName="registry-server" containerID="cri-o://d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a" gracePeriod=30 Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.053771 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tvhkw"] Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.054571 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.058808 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knb5d"] Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.059009 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-knb5d" podUID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerName="registry-server" containerID="cri-o://c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff" gracePeriod=30 Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.063244 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tvhkw"] Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.174947 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68abbbcf-c1ce-4be8-9252-9cd985160953-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tvhkw\" (UID: \"68abbbcf-c1ce-4be8-9252-9cd985160953\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" 
Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.175241 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsc4l\" (UniqueName: \"kubernetes.io/projected/68abbbcf-c1ce-4be8-9252-9cd985160953-kube-api-access-vsc4l\") pod \"marketplace-operator-79b997595-tvhkw\" (UID: \"68abbbcf-c1ce-4be8-9252-9cd985160953\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.175470 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68abbbcf-c1ce-4be8-9252-9cd985160953-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tvhkw\" (UID: \"68abbbcf-c1ce-4be8-9252-9cd985160953\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.277690 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68abbbcf-c1ce-4be8-9252-9cd985160953-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tvhkw\" (UID: \"68abbbcf-c1ce-4be8-9252-9cd985160953\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.277787 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsc4l\" (UniqueName: \"kubernetes.io/projected/68abbbcf-c1ce-4be8-9252-9cd985160953-kube-api-access-vsc4l\") pod \"marketplace-operator-79b997595-tvhkw\" (UID: \"68abbbcf-c1ce-4be8-9252-9cd985160953\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.277840 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/68abbbcf-c1ce-4be8-9252-9cd985160953-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tvhkw\" (UID: \"68abbbcf-c1ce-4be8-9252-9cd985160953\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.279356 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68abbbcf-c1ce-4be8-9252-9cd985160953-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tvhkw\" (UID: \"68abbbcf-c1ce-4be8-9252-9cd985160953\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.285310 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68abbbcf-c1ce-4be8-9252-9cd985160953-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tvhkw\" (UID: \"68abbbcf-c1ce-4be8-9252-9cd985160953\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.297553 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsc4l\" (UniqueName: \"kubernetes.io/projected/68abbbcf-c1ce-4be8-9252-9cd985160953-kube-api-access-vsc4l\") pod \"marketplace-operator-79b997595-tvhkw\" (UID: \"68abbbcf-c1ce-4be8-9252-9cd985160953\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" Mar 10 14:08:31 crc kubenswrapper[4911]: E0310 14:08:31.338954 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19 is running failed: container process not found" containerID="e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19" cmd=["grpc_health_probe","-addr=:50051"] Mar 10 14:08:31 
crc kubenswrapper[4911]: E0310 14:08:31.339570 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19 is running failed: container process not found" containerID="e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19" cmd=["grpc_health_probe","-addr=:50051"] Mar 10 14:08:31 crc kubenswrapper[4911]: E0310 14:08:31.339917 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19 is running failed: container process not found" containerID="e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19" cmd=["grpc_health_probe","-addr=:50051"] Mar 10 14:08:31 crc kubenswrapper[4911]: E0310 14:08:31.339950 4911 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-ttx9c" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerName="registry-server" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.374570 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" Mar 10 14:08:31 crc kubenswrapper[4911]: E0310 14:08:31.432396 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c is running failed: container process not found" containerID="cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c" cmd=["grpc_health_probe","-addr=:50051"] Mar 10 14:08:31 crc kubenswrapper[4911]: E0310 14:08:31.433163 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c is running failed: container process not found" containerID="cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c" cmd=["grpc_health_probe","-addr=:50051"] Mar 10 14:08:31 crc kubenswrapper[4911]: E0310 14:08:31.437213 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c is running failed: container process not found" containerID="cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c" cmd=["grpc_health_probe","-addr=:50051"] Mar 10 14:08:31 crc kubenswrapper[4911]: E0310 14:08:31.437262 4911 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-w2clr" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerName="registry-server" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.613486 4911 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w2clr" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.788895 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-catalog-content\") pod \"e1796b5b-f2e1-4a7a-9463-039bb296626a\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.789289 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wxf2\" (UniqueName: \"kubernetes.io/projected/e1796b5b-f2e1-4a7a-9463-039bb296626a-kube-api-access-7wxf2\") pod \"e1796b5b-f2e1-4a7a-9463-039bb296626a\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.789375 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-utilities\") pod \"e1796b5b-f2e1-4a7a-9463-039bb296626a\" (UID: \"e1796b5b-f2e1-4a7a-9463-039bb296626a\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.790562 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-utilities" (OuterVolumeSpecName: "utilities") pod "e1796b5b-f2e1-4a7a-9463-039bb296626a" (UID: "e1796b5b-f2e1-4a7a-9463-039bb296626a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.795279 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1796b5b-f2e1-4a7a-9463-039bb296626a-kube-api-access-7wxf2" (OuterVolumeSpecName: "kube-api-access-7wxf2") pod "e1796b5b-f2e1-4a7a-9463-039bb296626a" (UID: "e1796b5b-f2e1-4a7a-9463-039bb296626a"). 
InnerVolumeSpecName "kube-api-access-7wxf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.808252 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.840379 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.853075 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knb5d" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.860341 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttx9c" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.868516 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1796b5b-f2e1-4a7a-9463-039bb296626a" (UID: "e1796b5b-f2e1-4a7a-9463-039bb296626a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.890820 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.890846 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1796b5b-f2e1-4a7a-9463-039bb296626a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.890971 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wxf2\" (UniqueName: \"kubernetes.io/projected/e1796b5b-f2e1-4a7a-9463-039bb296626a-kube-api-access-7wxf2\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.946932 4911 generic.go:334] "Generic (PLEG): container finished" podID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerID="c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff" exitCode=0 Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.946992 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knb5d" event={"ID":"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9","Type":"ContainerDied","Data":"c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff"} Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.947024 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knb5d" event={"ID":"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9","Type":"ContainerDied","Data":"f012d7ee8fc0b3720c3f33c14f0131dab4a817632807c1044cd5cec5c402b45c"} Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.947042 4911 scope.go:117] "RemoveContainer" containerID="c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.947150 
4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knb5d" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.950401 4911 generic.go:334] "Generic (PLEG): container finished" podID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerID="a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e" exitCode=0 Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.950532 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.950712 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" event={"ID":"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02","Type":"ContainerDied","Data":"a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e"} Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.950787 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rbltz" event={"ID":"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02","Type":"ContainerDied","Data":"cec8345f9f997a7569abdb07e834fd72d25d07fdc4f23fdb5ee7c20a1ed3f45a"} Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.954681 4911 generic.go:334] "Generic (PLEG): container finished" podID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerID="cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c" exitCode=0 Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.954775 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w2clr" event={"ID":"e1796b5b-f2e1-4a7a-9463-039bb296626a","Type":"ContainerDied","Data":"cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c"} Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.954810 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-w2clr" event={"ID":"e1796b5b-f2e1-4a7a-9463-039bb296626a","Type":"ContainerDied","Data":"ef4db28afd25a9e8cd7c55eabfea72e0aaf4e86a99c96c684826fd1946b5bc00"} Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.954945 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w2clr" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.962387 4911 generic.go:334] "Generic (PLEG): container finished" podID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerID="e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19" exitCode=0 Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.962502 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttx9c" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.962539 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttx9c" event={"ID":"9549430d-b06c-4c28-87bc-6320e73c31e5","Type":"ContainerDied","Data":"e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19"} Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.962585 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttx9c" event={"ID":"9549430d-b06c-4c28-87bc-6320e73c31e5","Type":"ContainerDied","Data":"6ed2cf93eeb9238c65b001dc3d85d6036dd2b16d91d95e9ce224d23435058959"} Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.965455 4911 generic.go:334] "Generic (PLEG): container finished" podID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerID="d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a" exitCode=0 Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.965568 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrd7f" 
event={"ID":"124dcf69-acd6-4d61-ab64-3cf0840df098","Type":"ContainerDied","Data":"d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a"} Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.965625 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrd7f" event={"ID":"124dcf69-acd6-4d61-ab64-3cf0840df098","Type":"ContainerDied","Data":"090066ce5d6a66c35035dd3841a2af83c46b2c8cc0581e3c4f78e4146b3e2b8d"} Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.965634 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrd7f" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.966868 4911 scope.go:117] "RemoveContainer" containerID="501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.988746 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w2clr"] Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.991372 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w2clr"] Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.992356 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-operator-metrics\") pod \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.992428 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzwk4\" (UniqueName: \"kubernetes.io/projected/124dcf69-acd6-4d61-ab64-3cf0840df098-kube-api-access-wzwk4\") pod \"124dcf69-acd6-4d61-ab64-3cf0840df098\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 
14:08:31.992466 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kddf7\" (UniqueName: \"kubernetes.io/projected/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-kube-api-access-kddf7\") pod \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.992980 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-trusted-ca\") pod \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.993229 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p42rn\" (UniqueName: \"kubernetes.io/projected/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-kube-api-access-p42rn\") pod \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\" (UID: \"3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.993434 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-utilities\") pod \"124dcf69-acd6-4d61-ab64-3cf0840df098\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.993628 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-catalog-content\") pod \"9549430d-b06c-4c28-87bc-6320e73c31e5\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.993891 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-utilities\") pod \"9549430d-b06c-4c28-87bc-6320e73c31e5\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.994077 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-catalog-content\") pod \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.994284 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-utilities\") pod \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\" (UID: \"3ae1adcc-ec03-4cc0-9cff-3ab96da169f9\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.994442 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpcqd\" (UniqueName: \"kubernetes.io/projected/9549430d-b06c-4c28-87bc-6320e73c31e5-kube-api-access-hpcqd\") pod \"9549430d-b06c-4c28-87bc-6320e73c31e5\" (UID: \"9549430d-b06c-4c28-87bc-6320e73c31e5\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.994622 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-catalog-content\") pod \"124dcf69-acd6-4d61-ab64-3cf0840df098\" (UID: \"124dcf69-acd6-4d61-ab64-3cf0840df098\") " Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.993995 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" (UID: "3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.994570 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-utilities" (OuterVolumeSpecName: "utilities") pod "124dcf69-acd6-4d61-ab64-3cf0840df098" (UID: "124dcf69-acd6-4d61-ab64-3cf0840df098"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.996005 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-utilities" (OuterVolumeSpecName: "utilities") pod "9549430d-b06c-4c28-87bc-6320e73c31e5" (UID: "9549430d-b06c-4c28-87bc-6320e73c31e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.997715 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-utilities" (OuterVolumeSpecName: "utilities") pod "3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" (UID: "3ae1adcc-ec03-4cc0-9cff-3ab96da169f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.999336 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-kube-api-access-p42rn" (OuterVolumeSpecName: "kube-api-access-p42rn") pod "3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" (UID: "3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02"). InnerVolumeSpecName "kube-api-access-p42rn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:08:31 crc kubenswrapper[4911]: I0310 14:08:31.998979 4911 scope.go:117] "RemoveContainer" containerID="2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.003988 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9549430d-b06c-4c28-87bc-6320e73c31e5-kube-api-access-hpcqd" (OuterVolumeSpecName: "kube-api-access-hpcqd") pod "9549430d-b06c-4c28-87bc-6320e73c31e5" (UID: "9549430d-b06c-4c28-87bc-6320e73c31e5"). InnerVolumeSpecName "kube-api-access-hpcqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.004846 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-kube-api-access-kddf7" (OuterVolumeSpecName: "kube-api-access-kddf7") pod "3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" (UID: "3ae1adcc-ec03-4cc0-9cff-3ab96da169f9"). InnerVolumeSpecName "kube-api-access-kddf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.015826 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" (UID: "3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.015927 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124dcf69-acd6-4d61-ab64-3cf0840df098-kube-api-access-wzwk4" (OuterVolumeSpecName: "kube-api-access-wzwk4") pod "124dcf69-acd6-4d61-ab64-3cf0840df098" (UID: "124dcf69-acd6-4d61-ab64-3cf0840df098"). InnerVolumeSpecName "kube-api-access-wzwk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.022358 4911 scope.go:117] "RemoveContainer" containerID="c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff" Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.023007 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff\": container with ID starting with c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff not found: ID does not exist" containerID="c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.023052 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff"} err="failed to get container status \"c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff\": rpc error: code = NotFound desc = could not find container \"c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff\": container with ID starting with c56193bd4205b77b6c2e8eedaddfd35e637a0be22ff72913a71390d4ce8399ff not found: ID does not exist" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.023083 4911 scope.go:117] "RemoveContainer" containerID="501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd" Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.024699 4911 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd\": container with ID starting with 501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd not found: ID does not exist" containerID="501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.024770 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd"} err="failed to get container status \"501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd\": rpc error: code = NotFound desc = could not find container \"501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd\": container with ID starting with 501f252f7539276b8950b0a5be4e1df05de072ad1766e2a5c04a1a3f429af3cd not found: ID does not exist" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.024793 4911 scope.go:117] "RemoveContainer" containerID="2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55" Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.028339 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55\": container with ID starting with 2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55 not found: ID does not exist" containerID="2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.028388 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55"} err="failed to get container status \"2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55\": rpc error: code = NotFound 
desc = could not find container \"2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55\": container with ID starting with 2bb25f5b033fd83eda11887f938b22598be06ea086b844652f351f1cff86ae55 not found: ID does not exist" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.028423 4911 scope.go:117] "RemoveContainer" containerID="a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.042766 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tvhkw"] Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.046424 4911 scope.go:117] "RemoveContainer" containerID="cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.049170 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "124dcf69-acd6-4d61-ab64-3cf0840df098" (UID: "124dcf69-acd6-4d61-ab64-3cf0840df098"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.064531 4911 scope.go:117] "RemoveContainer" containerID="a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e" Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.065213 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e\": container with ID starting with a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e not found: ID does not exist" containerID="a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.065263 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e"} err="failed to get container status \"a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e\": rpc error: code = NotFound desc = could not find container \"a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e\": container with ID starting with a0870efda0a28223f0f97df24aa67f3230efe66fbc94eedde27b5f725a35835e not found: ID does not exist" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.065290 4911 scope.go:117] "RemoveContainer" containerID="cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b" Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.065916 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b\": container with ID starting with cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b not found: ID does not exist" containerID="cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.065991 
4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b"} err="failed to get container status \"cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b\": rpc error: code = NotFound desc = could not find container \"cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b\": container with ID starting with cfa202fe25a553a8d8318278a310301f5f1275737a6d42760b1e755a3fc65b8b not found: ID does not exist" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.066029 4911 scope.go:117] "RemoveContainer" containerID="cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.070483 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9549430d-b06c-4c28-87bc-6320e73c31e5" (UID: "9549430d-b06c-4c28-87bc-6320e73c31e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.086020 4911 scope.go:117] "RemoveContainer" containerID="936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100106 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzwk4\" (UniqueName: \"kubernetes.io/projected/124dcf69-acd6-4d61-ab64-3cf0840df098-kube-api-access-wzwk4\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100146 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kddf7\" (UniqueName: \"kubernetes.io/projected/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-kube-api-access-kddf7\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100160 4911 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100174 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p42rn\" (UniqueName: \"kubernetes.io/projected/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-kube-api-access-p42rn\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100188 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100200 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100211 4911 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9549430d-b06c-4c28-87bc-6320e73c31e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100222 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100234 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpcqd\" (UniqueName: \"kubernetes.io/projected/9549430d-b06c-4c28-87bc-6320e73c31e5-kube-api-access-hpcqd\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100245 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124dcf69-acd6-4d61-ab64-3cf0840df098-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.100259 4911 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.115769 4911 scope.go:117] "RemoveContainer" containerID="bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.145271 4911 scope.go:117] "RemoveContainer" containerID="cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c" Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.145979 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c\": container with ID starting with cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c not found: ID does not 
exist" containerID="cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.146045 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c"} err="failed to get container status \"cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c\": rpc error: code = NotFound desc = could not find container \"cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c\": container with ID starting with cc96fc8870e23b6b6fdc57013185b6d957abdd2aa18585648d5e692039e71b8c not found: ID does not exist" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.146086 4911 scope.go:117] "RemoveContainer" containerID="936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e" Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.146763 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e\": container with ID starting with 936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e not found: ID does not exist" containerID="936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.146829 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e"} err="failed to get container status \"936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e\": rpc error: code = NotFound desc = could not find container \"936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e\": container with ID starting with 936f48f5d58e4c1f6ea1f8f5074adb9dce1c628873bfca64b9e7e9fea494573e not found: ID does not exist" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.146868 4911 scope.go:117] 
"RemoveContainer" containerID="bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4" Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.147297 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4\": container with ID starting with bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4 not found: ID does not exist" containerID="bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.147331 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4"} err="failed to get container status \"bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4\": rpc error: code = NotFound desc = could not find container \"bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4\": container with ID starting with bf204f62ca750a7137b057e325d29e4fed8cdb846ae0c68b6b2f8e2a08c68ff4 not found: ID does not exist" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.147350 4911 scope.go:117] "RemoveContainer" containerID="e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.169377 4911 scope.go:117] "RemoveContainer" containerID="e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.189940 4911 scope.go:117] "RemoveContainer" containerID="b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.205084 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" path="/var/lib/kubelet/pods/e1796b5b-f2e1-4a7a-9463-039bb296626a/volumes" Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 
14:08:32.207534 4911 scope.go:117] "RemoveContainer" containerID="e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19"
Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.211864 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19\": container with ID starting with e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19 not found: ID does not exist" containerID="e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.212244 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19"} err="failed to get container status \"e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19\": rpc error: code = NotFound desc = could not find container \"e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19\": container with ID starting with e46f8498f3a7c92d42518e48b55aa0abc45c024083694495f4952cc81256eb19 not found: ID does not exist"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.212304 4911 scope.go:117] "RemoveContainer" containerID="e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157"
Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.212834 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157\": container with ID starting with e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157 not found: ID does not exist" containerID="e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.212877 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157"} err="failed to get container status \"e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157\": rpc error: code = NotFound desc = could not find container \"e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157\": container with ID starting with e96895a0aa4430dcbb64124c65442ee166c19b8aa40bbb75ce1b55c7f3751157 not found: ID does not exist"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.212910 4911 scope.go:117] "RemoveContainer" containerID="b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb"
Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.213424 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb\": container with ID starting with b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb not found: ID does not exist" containerID="b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.213523 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb"} err="failed to get container status \"b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb\": rpc error: code = NotFound desc = could not find container \"b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb\": container with ID starting with b65298cf0c3178c2fa6f4c41379dee889cd66c2e48190cc618e71bc41d4021cb not found: ID does not exist"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.213548 4911 scope.go:117] "RemoveContainer" containerID="d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.227356 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" (UID: "3ae1adcc-ec03-4cc0-9cff-3ab96da169f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.233907 4911 scope.go:117] "RemoveContainer" containerID="e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.258028 4911 scope.go:117] "RemoveContainer" containerID="5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.269117 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rbltz"]
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.274323 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rbltz"]
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.285198 4911 scope.go:117] "RemoveContainer" containerID="d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a"
Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.286009 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a\": container with ID starting with d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a not found: ID does not exist" containerID="d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.286045 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a"} err="failed to get container status \"d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a\": rpc error: code = NotFound desc = could not find container \"d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a\": container with ID starting with d5c81da8ba396542e06c6984b4fd800fec822ab321aefaca46a604f8a7b9248a not found: ID does not exist"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.286084 4911 scope.go:117] "RemoveContainer" containerID="e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1"
Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.287157 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1\": container with ID starting with e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1 not found: ID does not exist" containerID="e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.287216 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1"} err="failed to get container status \"e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1\": rpc error: code = NotFound desc = could not find container \"e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1\": container with ID starting with e093bcc06f7a7733f0e648e986a57a1eef53eefaebd94f06bbf8e4348ea273f1 not found: ID does not exist"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.287252 4911 scope.go:117] "RemoveContainer" containerID="5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787"
Mar 10 14:08:32 crc kubenswrapper[4911]: E0310 14:08:32.290996 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787\": container with ID starting with 5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787 not found: ID does not exist" containerID="5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.291064 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787"} err="failed to get container status \"5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787\": rpc error: code = NotFound desc = could not find container \"5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787\": container with ID starting with 5b2209ea8b8469143128ec0790de9d431c8382ef8e56d31b1fa6027ee1d87787 not found: ID does not exist"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.295331 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ttx9c"]
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.298957 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ttx9c"]
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.303148 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.307765 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrd7f"]
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.312195 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrd7f"]
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.323794 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knb5d"]
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.323860 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-knb5d"]
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.994775 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" event={"ID":"68abbbcf-c1ce-4be8-9252-9cd985160953","Type":"ContainerStarted","Data":"05dbe6e95a7665f168249e7238d6ca17f910c259d38593ea59f9e969f4b7e6c8"}
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.995408 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" event={"ID":"68abbbcf-c1ce-4be8-9252-9cd985160953","Type":"ContainerStarted","Data":"f14bc22b234ba409823df3f004a3bde575d4345f953500f994fa673ed6d0e6fb"}
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.996739 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw"
Mar 10 14:08:32 crc kubenswrapper[4911]: I0310 14:08:32.999611 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.020019 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tvhkw" podStartSLOduration=2.01998633 podStartE2EDuration="2.01998633s" podCreationTimestamp="2026-03-10 14:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:08:33.019529088 +0000 UTC m=+417.583049035" watchObservedRunningTime="2026-03-10 14:08:33.01998633 +0000 UTC m=+417.583506257"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.231679 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kgjb4"]
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.231949 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerName="extract-content"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.231961 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerName="extract-content"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.231973 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.231980 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.231992 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerName="extract-utilities"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.231998 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerName="extract-utilities"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232005 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232011 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232021 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerName="extract-utilities"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232027 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerName="extract-utilities"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232034 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerName="extract-content"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232040 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerName="extract-content"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232051 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerName="extract-utilities"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232057 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerName="extract-utilities"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232107 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerName="extract-utilities"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232115 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerName="extract-utilities"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232124 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerName="marketplace-operator"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232131 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerName="marketplace-operator"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232138 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerName="extract-content"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232165 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerName="extract-content"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232172 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerName="marketplace-operator"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232178 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerName="marketplace-operator"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232186 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232192 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232199 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerName="extract-content"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232205 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerName="extract-content"
Mar 10 14:08:33 crc kubenswrapper[4911]: E0310 14:08:33.232213 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232236 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232561 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232609 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerName="marketplace-operator"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232619 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" containerName="marketplace-operator"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232674 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232682 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1796b5b-f2e1-4a7a-9463-039bb296626a" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.232693 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" containerName="registry-server"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.235370 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.239001 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.241691 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgjb4"]
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.424143 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef-catalog-content\") pod \"redhat-marketplace-kgjb4\" (UID: \"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef\") " pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.424242 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef-utilities\") pod \"redhat-marketplace-kgjb4\" (UID: \"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef\") " pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.424331 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv4hw\" (UniqueName: \"kubernetes.io/projected/07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef-kube-api-access-kv4hw\") pod \"redhat-marketplace-kgjb4\" (UID: \"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef\") " pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.428437 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kx5pg"]
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.429464 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.431814 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.442152 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kx5pg"]
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.525966 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv4hw\" (UniqueName: \"kubernetes.io/projected/07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef-kube-api-access-kv4hw\") pod \"redhat-marketplace-kgjb4\" (UID: \"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef\") " pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.526027 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db45043e-a5f4-4e42-a74b-a6477031d06d-utilities\") pod \"redhat-operators-kx5pg\" (UID: \"db45043e-a5f4-4e42-a74b-a6477031d06d\") " pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.526071 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp4jm\" (UniqueName: \"kubernetes.io/projected/db45043e-a5f4-4e42-a74b-a6477031d06d-kube-api-access-dp4jm\") pod \"redhat-operators-kx5pg\" (UID: \"db45043e-a5f4-4e42-a74b-a6477031d06d\") " pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.526124 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef-catalog-content\") pod \"redhat-marketplace-kgjb4\" (UID: \"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef\") " pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.526198 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef-utilities\") pod \"redhat-marketplace-kgjb4\" (UID: \"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef\") " pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.526250 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db45043e-a5f4-4e42-a74b-a6477031d06d-catalog-content\") pod \"redhat-operators-kx5pg\" (UID: \"db45043e-a5f4-4e42-a74b-a6477031d06d\") " pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.526631 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef-catalog-content\") pod \"redhat-marketplace-kgjb4\" (UID: \"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef\") " pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.526659 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef-utilities\") pod \"redhat-marketplace-kgjb4\" (UID: \"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef\") " pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.558686 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv4hw\" (UniqueName: \"kubernetes.io/projected/07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef-kube-api-access-kv4hw\") pod \"redhat-marketplace-kgjb4\" (UID: \"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef\") " pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.627835 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp4jm\" (UniqueName: \"kubernetes.io/projected/db45043e-a5f4-4e42-a74b-a6477031d06d-kube-api-access-dp4jm\") pod \"redhat-operators-kx5pg\" (UID: \"db45043e-a5f4-4e42-a74b-a6477031d06d\") " pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.627957 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db45043e-a5f4-4e42-a74b-a6477031d06d-catalog-content\") pod \"redhat-operators-kx5pg\" (UID: \"db45043e-a5f4-4e42-a74b-a6477031d06d\") " pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.627990 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db45043e-a5f4-4e42-a74b-a6477031d06d-utilities\") pod \"redhat-operators-kx5pg\" (UID: \"db45043e-a5f4-4e42-a74b-a6477031d06d\") " pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.628516 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db45043e-a5f4-4e42-a74b-a6477031d06d-utilities\") pod \"redhat-operators-kx5pg\" (UID: \"db45043e-a5f4-4e42-a74b-a6477031d06d\") " pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.628658 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db45043e-a5f4-4e42-a74b-a6477031d06d-catalog-content\") pod \"redhat-operators-kx5pg\" (UID: \"db45043e-a5f4-4e42-a74b-a6477031d06d\") " pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.647874 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp4jm\" (UniqueName: \"kubernetes.io/projected/db45043e-a5f4-4e42-a74b-a6477031d06d-kube-api-access-dp4jm\") pod \"redhat-operators-kx5pg\" (UID: \"db45043e-a5f4-4e42-a74b-a6477031d06d\") " pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.750914 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kx5pg"
Mar 10 14:08:33 crc kubenswrapper[4911]: I0310 14:08:33.857495 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgjb4"
Mar 10 14:08:34 crc kubenswrapper[4911]: I0310 14:08:34.176373 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kx5pg"]
Mar 10 14:08:34 crc kubenswrapper[4911]: W0310 14:08:34.186284 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb45043e_a5f4_4e42_a74b_a6477031d06d.slice/crio-02c8dd574819a0d527a7ef354be6e5e9cc26477f0f6d101b0a97b4d9693a6c6e WatchSource:0}: Error finding container 02c8dd574819a0d527a7ef354be6e5e9cc26477f0f6d101b0a97b4d9693a6c6e: Status 404 returned error can't find the container with id 02c8dd574819a0d527a7ef354be6e5e9cc26477f0f6d101b0a97b4d9693a6c6e
Mar 10 14:08:34 crc kubenswrapper[4911]: I0310 14:08:34.201904 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124dcf69-acd6-4d61-ab64-3cf0840df098" path="/var/lib/kubelet/pods/124dcf69-acd6-4d61-ab64-3cf0840df098/volumes"
Mar 10 14:08:34 crc kubenswrapper[4911]: I0310 14:08:34.204442 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae1adcc-ec03-4cc0-9cff-3ab96da169f9" path="/var/lib/kubelet/pods/3ae1adcc-ec03-4cc0-9cff-3ab96da169f9/volumes"
Mar 10 14:08:34 crc kubenswrapper[4911]: I0310 14:08:34.206193 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02" path="/var/lib/kubelet/pods/3e3bb0fc-9a7a-4008-9d8d-c84e19b65c02/volumes"
Mar 10 14:08:34 crc kubenswrapper[4911]: I0310 14:08:34.208006 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9549430d-b06c-4c28-87bc-6320e73c31e5" path="/var/lib/kubelet/pods/9549430d-b06c-4c28-87bc-6320e73c31e5/volumes"
Mar 10 14:08:34 crc kubenswrapper[4911]: I0310 14:08:34.275323 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgjb4"]
Mar 10 14:08:34 crc kubenswrapper[4911]: W0310 14:08:34.279745 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07da101e_bfe5_4c96_b1ed_9a9b4bc7e6ef.slice/crio-4819c39bf2dfc32d975aee35ff2e9cf0e61689d818a9fc1acae6cfb61859f1a8 WatchSource:0}: Error finding container 4819c39bf2dfc32d975aee35ff2e9cf0e61689d818a9fc1acae6cfb61859f1a8: Status 404 returned error can't find the container with id 4819c39bf2dfc32d975aee35ff2e9cf0e61689d818a9fc1acae6cfb61859f1a8
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.037151 4911 generic.go:334] "Generic (PLEG): container finished" podID="07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef" containerID="f07949cd3cfcf0a8de9ee1035ab29fb9b03128c6f857dceb4536f5261e240c5f" exitCode=0
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.037235 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgjb4" event={"ID":"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef","Type":"ContainerDied","Data":"f07949cd3cfcf0a8de9ee1035ab29fb9b03128c6f857dceb4536f5261e240c5f"}
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.037868 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgjb4" event={"ID":"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef","Type":"ContainerStarted","Data":"4819c39bf2dfc32d975aee35ff2e9cf0e61689d818a9fc1acae6cfb61859f1a8"}
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.042150 4911 generic.go:334] "Generic (PLEG): container finished" podID="db45043e-a5f4-4e42-a74b-a6477031d06d" containerID="8fb88fa0f8730681a4509533b901349c6f596b34b83780c7cf770723bb5ae0f6" exitCode=0
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.042258 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx5pg" event={"ID":"db45043e-a5f4-4e42-a74b-a6477031d06d","Type":"ContainerDied","Data":"8fb88fa0f8730681a4509533b901349c6f596b34b83780c7cf770723bb5ae0f6"}
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.042367 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx5pg" event={"ID":"db45043e-a5f4-4e42-a74b-a6477031d06d","Type":"ContainerStarted","Data":"02c8dd574819a0d527a7ef354be6e5e9cc26477f0f6d101b0a97b4d9693a6c6e"}
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.627955 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lrkfz"]
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.630066 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.634128 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.639251 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lrkfz"]
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.672494 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c0803f-7b0b-48e8-b19d-d81138d5fc10-catalog-content\") pod \"certified-operators-lrkfz\" (UID: \"94c0803f-7b0b-48e8-b19d-d81138d5fc10\") " pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.672563 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c0803f-7b0b-48e8-b19d-d81138d5fc10-utilities\") pod \"certified-operators-lrkfz\" (UID: \"94c0803f-7b0b-48e8-b19d-d81138d5fc10\") " pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.672625 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqln8\" (UniqueName: \"kubernetes.io/projected/94c0803f-7b0b-48e8-b19d-d81138d5fc10-kube-api-access-lqln8\") pod \"certified-operators-lrkfz\" (UID: \"94c0803f-7b0b-48e8-b19d-d81138d5fc10\") " pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.774614 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqln8\" (UniqueName: \"kubernetes.io/projected/94c0803f-7b0b-48e8-b19d-d81138d5fc10-kube-api-access-lqln8\") pod \"certified-operators-lrkfz\" (UID: \"94c0803f-7b0b-48e8-b19d-d81138d5fc10\") " pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.774753 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c0803f-7b0b-48e8-b19d-d81138d5fc10-catalog-content\") pod \"certified-operators-lrkfz\" (UID: \"94c0803f-7b0b-48e8-b19d-d81138d5fc10\") " pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.774797 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c0803f-7b0b-48e8-b19d-d81138d5fc10-utilities\") pod \"certified-operators-lrkfz\" (UID: \"94c0803f-7b0b-48e8-b19d-d81138d5fc10\") " pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.775401 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c0803f-7b0b-48e8-b19d-d81138d5fc10-catalog-content\") pod \"certified-operators-lrkfz\" (UID: \"94c0803f-7b0b-48e8-b19d-d81138d5fc10\") " pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.775915 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c0803f-7b0b-48e8-b19d-d81138d5fc10-utilities\") pod \"certified-operators-lrkfz\" (UID: \"94c0803f-7b0b-48e8-b19d-d81138d5fc10\") " pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.803008 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqln8\" (UniqueName: \"kubernetes.io/projected/94c0803f-7b0b-48e8-b19d-d81138d5fc10-kube-api-access-lqln8\") pod \"certified-operators-lrkfz\" (UID: \"94c0803f-7b0b-48e8-b19d-d81138d5fc10\") " pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.825408 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dfhbk"]
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.827484 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfhbk"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.830847 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.849868 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfhbk"]
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.876510 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1081e8d1-8f67-41ea-8fbb-a418473c68ca-utilities\") pod \"community-operators-dfhbk\" (UID: \"1081e8d1-8f67-41ea-8fbb-a418473c68ca\") " pod="openshift-marketplace/community-operators-dfhbk"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.876591 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1081e8d1-8f67-41ea-8fbb-a418473c68ca-catalog-content\") pod \"community-operators-dfhbk\" (UID: \"1081e8d1-8f67-41ea-8fbb-a418473c68ca\") " pod="openshift-marketplace/community-operators-dfhbk"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.876625 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgsns\" (UniqueName: \"kubernetes.io/projected/1081e8d1-8f67-41ea-8fbb-a418473c68ca-kube-api-access-zgsns\") pod \"community-operators-dfhbk\" (UID: \"1081e8d1-8f67-41ea-8fbb-a418473c68ca\") " pod="openshift-marketplace/community-operators-dfhbk"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.950159 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lrkfz"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.977796 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1081e8d1-8f67-41ea-8fbb-a418473c68ca-catalog-content\") pod \"community-operators-dfhbk\" (UID: \"1081e8d1-8f67-41ea-8fbb-a418473c68ca\") " pod="openshift-marketplace/community-operators-dfhbk"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.978280 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgsns\" (UniqueName: \"kubernetes.io/projected/1081e8d1-8f67-41ea-8fbb-a418473c68ca-kube-api-access-zgsns\") pod \"community-operators-dfhbk\" (UID: \"1081e8d1-8f67-41ea-8fbb-a418473c68ca\") " pod="openshift-marketplace/community-operators-dfhbk"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.978351 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1081e8d1-8f67-41ea-8fbb-a418473c68ca-utilities\") pod \"community-operators-dfhbk\" (UID: \"1081e8d1-8f67-41ea-8fbb-a418473c68ca\") " pod="openshift-marketplace/community-operators-dfhbk"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.978868 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1081e8d1-8f67-41ea-8fbb-a418473c68ca-catalog-content\") pod \"community-operators-dfhbk\" (UID: \"1081e8d1-8f67-41ea-8fbb-a418473c68ca\") " pod="openshift-marketplace/community-operators-dfhbk"
Mar 10 14:08:35 crc kubenswrapper[4911]: I0310 14:08:35.978907 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1081e8d1-8f67-41ea-8fbb-a418473c68ca-utilities\") pod \"community-operators-dfhbk\" (UID: \"1081e8d1-8f67-41ea-8fbb-a418473c68ca\") " pod="openshift-marketplace/community-operators-dfhbk"
Mar 10 14:08:36 crc kubenswrapper[4911]: I0310 14:08:36.012026 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgsns\" (UniqueName: \"kubernetes.io/projected/1081e8d1-8f67-41ea-8fbb-a418473c68ca-kube-api-access-zgsns\") pod \"community-operators-dfhbk\" (UID: \"1081e8d1-8f67-41ea-8fbb-a418473c68ca\") " pod="openshift-marketplace/community-operators-dfhbk"
Mar 10 14:08:36 crc kubenswrapper[4911]: I0310 14:08:36.153580 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 14:08:36 crc kubenswrapper[4911]: I0310 14:08:36.162088 4911 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-dfhbk" Mar 10 14:08:36 crc kubenswrapper[4911]: I0310 14:08:36.396207 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lrkfz"] Mar 10 14:08:36 crc kubenswrapper[4911]: W0310 14:08:36.425033 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94c0803f_7b0b_48e8_b19d_d81138d5fc10.slice/crio-5ecc23210ec8415309edda4291e68e64bd49d18d866aa1317e1ab72b1d401d94 WatchSource:0}: Error finding container 5ecc23210ec8415309edda4291e68e64bd49d18d866aa1317e1ab72b1d401d94: Status 404 returned error can't find the container with id 5ecc23210ec8415309edda4291e68e64bd49d18d866aa1317e1ab72b1d401d94 Mar 10 14:08:36 crc kubenswrapper[4911]: I0310 14:08:36.613319 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfhbk"] Mar 10 14:08:36 crc kubenswrapper[4911]: W0310 14:08:36.652588 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1081e8d1_8f67_41ea_8fbb_a418473c68ca.slice/crio-501765551a964ec475998c6422c6c7edbcb6c6fe085c626b689a84b766a8e835 WatchSource:0}: Error finding container 501765551a964ec475998c6422c6c7edbcb6c6fe085c626b689a84b766a8e835: Status 404 returned error can't find the container with id 501765551a964ec475998c6422c6c7edbcb6c6fe085c626b689a84b766a8e835 Mar 10 14:08:37 crc kubenswrapper[4911]: I0310 14:08:37.073050 4911 generic.go:334] "Generic (PLEG): container finished" podID="db45043e-a5f4-4e42-a74b-a6477031d06d" containerID="4a190b6f263a1d9856496fcf5c90c08ee2a18995b410167d3ebb78bde2f6636d" exitCode=0 Mar 10 14:08:37 crc kubenswrapper[4911]: I0310 14:08:37.073172 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx5pg" 
event={"ID":"db45043e-a5f4-4e42-a74b-a6477031d06d","Type":"ContainerDied","Data":"4a190b6f263a1d9856496fcf5c90c08ee2a18995b410167d3ebb78bde2f6636d"} Mar 10 14:08:37 crc kubenswrapper[4911]: I0310 14:08:37.082286 4911 generic.go:334] "Generic (PLEG): container finished" podID="94c0803f-7b0b-48e8-b19d-d81138d5fc10" containerID="1665276d106cc8ab82c4da08969cc9446c4257f8e5551ce9ced49c41b14c9b73" exitCode=0 Mar 10 14:08:37 crc kubenswrapper[4911]: I0310 14:08:37.082480 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrkfz" event={"ID":"94c0803f-7b0b-48e8-b19d-d81138d5fc10","Type":"ContainerDied","Data":"1665276d106cc8ab82c4da08969cc9446c4257f8e5551ce9ced49c41b14c9b73"} Mar 10 14:08:37 crc kubenswrapper[4911]: I0310 14:08:37.082561 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrkfz" event={"ID":"94c0803f-7b0b-48e8-b19d-d81138d5fc10","Type":"ContainerStarted","Data":"5ecc23210ec8415309edda4291e68e64bd49d18d866aa1317e1ab72b1d401d94"} Mar 10 14:08:37 crc kubenswrapper[4911]: I0310 14:08:37.091596 4911 generic.go:334] "Generic (PLEG): container finished" podID="1081e8d1-8f67-41ea-8fbb-a418473c68ca" containerID="058a9ff4fd37231b1ba7433320ef300ac9630eb2f8e4017c898c9653e0a94ba0" exitCode=0 Mar 10 14:08:37 crc kubenswrapper[4911]: I0310 14:08:37.091737 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfhbk" event={"ID":"1081e8d1-8f67-41ea-8fbb-a418473c68ca","Type":"ContainerDied","Data":"058a9ff4fd37231b1ba7433320ef300ac9630eb2f8e4017c898c9653e0a94ba0"} Mar 10 14:08:37 crc kubenswrapper[4911]: I0310 14:08:37.091776 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfhbk" event={"ID":"1081e8d1-8f67-41ea-8fbb-a418473c68ca","Type":"ContainerStarted","Data":"501765551a964ec475998c6422c6c7edbcb6c6fe085c626b689a84b766a8e835"} Mar 10 14:08:37 crc kubenswrapper[4911]: I0310 
14:08:37.109005 4911 generic.go:334] "Generic (PLEG): container finished" podID="07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef" containerID="8f82dc0d57f7539d7113ef9d7683c0ace90ede7738d74e5c1d5a254e00d68f80" exitCode=0 Mar 10 14:08:37 crc kubenswrapper[4911]: I0310 14:08:37.109088 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgjb4" event={"ID":"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef","Type":"ContainerDied","Data":"8f82dc0d57f7539d7113ef9d7683c0ace90ede7738d74e5c1d5a254e00d68f80"} Mar 10 14:08:38 crc kubenswrapper[4911]: I0310 14:08:38.124829 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx5pg" event={"ID":"db45043e-a5f4-4e42-a74b-a6477031d06d","Type":"ContainerStarted","Data":"e35f4c4128534b564460bbbf8cc274fc6471fc9369c1a51f22fb1311ce0393b9"} Mar 10 14:08:38 crc kubenswrapper[4911]: I0310 14:08:38.136620 4911 generic.go:334] "Generic (PLEG): container finished" podID="1081e8d1-8f67-41ea-8fbb-a418473c68ca" containerID="9a18db50926e7b715658783b89a6fa638bd02077eb44d9e2e63b1b2aecbf14b5" exitCode=0 Mar 10 14:08:38 crc kubenswrapper[4911]: I0310 14:08:38.136714 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfhbk" event={"ID":"1081e8d1-8f67-41ea-8fbb-a418473c68ca","Type":"ContainerDied","Data":"9a18db50926e7b715658783b89a6fa638bd02077eb44d9e2e63b1b2aecbf14b5"} Mar 10 14:08:38 crc kubenswrapper[4911]: I0310 14:08:38.143629 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kx5pg" podStartSLOduration=2.6355099109999998 podStartE2EDuration="5.143603276s" podCreationTimestamp="2026-03-10 14:08:33 +0000 UTC" firstStartedPulling="2026-03-10 14:08:35.04497161 +0000 UTC m=+419.608491527" lastFinishedPulling="2026-03-10 14:08:37.553064955 +0000 UTC m=+422.116584892" observedRunningTime="2026-03-10 14:08:38.142305442 +0000 UTC m=+422.705825359" 
watchObservedRunningTime="2026-03-10 14:08:38.143603276 +0000 UTC m=+422.707123193" Mar 10 14:08:38 crc kubenswrapper[4911]: I0310 14:08:38.148537 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgjb4" event={"ID":"07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef","Type":"ContainerStarted","Data":"609b97dfcb9460c382fec31345a135b2d62ce3046296310b30bdcb7b77e32854"} Mar 10 14:08:38 crc kubenswrapper[4911]: I0310 14:08:38.209113 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kgjb4" podStartSLOduration=2.728691477 podStartE2EDuration="5.209084446s" podCreationTimestamp="2026-03-10 14:08:33 +0000 UTC" firstStartedPulling="2026-03-10 14:08:35.041998554 +0000 UTC m=+419.605518471" lastFinishedPulling="2026-03-10 14:08:37.522391513 +0000 UTC m=+422.085911440" observedRunningTime="2026-03-10 14:08:38.201376945 +0000 UTC m=+422.764896872" watchObservedRunningTime="2026-03-10 14:08:38.209084446 +0000 UTC m=+422.772604363" Mar 10 14:08:39 crc kubenswrapper[4911]: I0310 14:08:39.157136 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfhbk" event={"ID":"1081e8d1-8f67-41ea-8fbb-a418473c68ca","Type":"ContainerStarted","Data":"4caa200761794399c4a762d7c02c0b804aba256f5579dc8f1b0df66cc4626889"} Mar 10 14:08:39 crc kubenswrapper[4911]: I0310 14:08:39.160099 4911 generic.go:334] "Generic (PLEG): container finished" podID="94c0803f-7b0b-48e8-b19d-d81138d5fc10" containerID="664e3aba83a728754946a6839e0d0a0905d8042d600b0056efc58caf7d0ee2c6" exitCode=0 Mar 10 14:08:39 crc kubenswrapper[4911]: I0310 14:08:39.160932 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrkfz" event={"ID":"94c0803f-7b0b-48e8-b19d-d81138d5fc10","Type":"ContainerDied","Data":"664e3aba83a728754946a6839e0d0a0905d8042d600b0056efc58caf7d0ee2c6"} Mar 10 14:08:39 crc kubenswrapper[4911]: I0310 
14:08:39.210325 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dfhbk" podStartSLOduration=2.705310894 podStartE2EDuration="4.21030235s" podCreationTimestamp="2026-03-10 14:08:35 +0000 UTC" firstStartedPulling="2026-03-10 14:08:37.095963534 +0000 UTC m=+421.659483451" lastFinishedPulling="2026-03-10 14:08:38.60095499 +0000 UTC m=+423.164474907" observedRunningTime="2026-03-10 14:08:39.186229653 +0000 UTC m=+423.749749590" watchObservedRunningTime="2026-03-10 14:08:39.21030235 +0000 UTC m=+423.773822267" Mar 10 14:08:40 crc kubenswrapper[4911]: I0310 14:08:40.169563 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrkfz" event={"ID":"94c0803f-7b0b-48e8-b19d-d81138d5fc10","Type":"ContainerStarted","Data":"5537000ae840fa5e63d4a70a199494b30e4d1ea5d36a3f66ce946fa320fdcf70"} Mar 10 14:08:40 crc kubenswrapper[4911]: I0310 14:08:40.196852 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lrkfz" podStartSLOduration=2.584445593 podStartE2EDuration="5.19683142s" podCreationTimestamp="2026-03-10 14:08:35 +0000 UTC" firstStartedPulling="2026-03-10 14:08:37.087386511 +0000 UTC m=+421.650906428" lastFinishedPulling="2026-03-10 14:08:39.699772338 +0000 UTC m=+424.263292255" observedRunningTime="2026-03-10 14:08:40.193002371 +0000 UTC m=+424.756522288" watchObservedRunningTime="2026-03-10 14:08:40.19683142 +0000 UTC m=+424.760351347" Mar 10 14:08:43 crc kubenswrapper[4911]: I0310 14:08:43.752388 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kx5pg" Mar 10 14:08:43 crc kubenswrapper[4911]: I0310 14:08:43.753070 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kx5pg" Mar 10 14:08:43 crc kubenswrapper[4911]: I0310 14:08:43.859846 4911 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kgjb4" Mar 10 14:08:43 crc kubenswrapper[4911]: I0310 14:08:43.859920 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kgjb4" Mar 10 14:08:43 crc kubenswrapper[4911]: I0310 14:08:43.922286 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kgjb4" Mar 10 14:08:44 crc kubenswrapper[4911]: I0310 14:08:44.257317 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kgjb4" Mar 10 14:08:44 crc kubenswrapper[4911]: I0310 14:08:44.802870 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kx5pg" podUID="db45043e-a5f4-4e42-a74b-a6477031d06d" containerName="registry-server" probeResult="failure" output=< Mar 10 14:08:44 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s Mar 10 14:08:44 crc kubenswrapper[4911]: > Mar 10 14:08:45 crc kubenswrapper[4911]: I0310 14:08:45.951058 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lrkfz" Mar 10 14:08:45 crc kubenswrapper[4911]: I0310 14:08:45.951855 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lrkfz" Mar 10 14:08:46 crc kubenswrapper[4911]: I0310 14:08:46.024228 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lrkfz" Mar 10 14:08:46 crc kubenswrapper[4911]: I0310 14:08:46.162598 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dfhbk" Mar 10 14:08:46 crc kubenswrapper[4911]: I0310 14:08:46.162679 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-dfhbk" Mar 10 14:08:46 crc kubenswrapper[4911]: I0310 14:08:46.229476 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dfhbk" Mar 10 14:08:46 crc kubenswrapper[4911]: I0310 14:08:46.272712 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lrkfz" Mar 10 14:08:46 crc kubenswrapper[4911]: I0310 14:08:46.292877 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dfhbk" Mar 10 14:08:48 crc kubenswrapper[4911]: I0310 14:08:48.521057 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:08:48 crc kubenswrapper[4911]: I0310 14:08:48.521502 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:08:49 crc kubenswrapper[4911]: I0310 14:08:49.645549 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wwsm2" Mar 10 14:08:49 crc kubenswrapper[4911]: I0310 14:08:49.713536 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m5pk6"] Mar 10 14:08:53 crc kubenswrapper[4911]: I0310 14:08:53.806780 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kx5pg" Mar 10 14:08:53 crc kubenswrapper[4911]: I0310 
14:08:53.851221 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kx5pg" Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.342045 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"] Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.343147 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69" podUID="bb92791b-a628-41ac-ad29-7fb59c0c35f8" containerName="route-controller-manager" containerID="cri-o://2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a" gracePeriod=30 Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.714793 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69" Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.776355 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9rcv\" (UniqueName: \"kubernetes.io/projected/bb92791b-a628-41ac-ad29-7fb59c0c35f8-kube-api-access-z9rcv\") pod \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.776454 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-config\") pod \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.776525 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-client-ca\") pod \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\" (UID: 
\"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.776581 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb92791b-a628-41ac-ad29-7fb59c0c35f8-serving-cert\") pod \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\" (UID: \"bb92791b-a628-41ac-ad29-7fb59c0c35f8\") " Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.778248 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "bb92791b-a628-41ac-ad29-7fb59c0c35f8" (UID: "bb92791b-a628-41ac-ad29-7fb59c0c35f8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.778298 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-config" (OuterVolumeSpecName: "config") pod "bb92791b-a628-41ac-ad29-7fb59c0c35f8" (UID: "bb92791b-a628-41ac-ad29-7fb59c0c35f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.786255 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb92791b-a628-41ac-ad29-7fb59c0c35f8-kube-api-access-z9rcv" (OuterVolumeSpecName: "kube-api-access-z9rcv") pod "bb92791b-a628-41ac-ad29-7fb59c0c35f8" (UID: "bb92791b-a628-41ac-ad29-7fb59c0c35f8"). InnerVolumeSpecName "kube-api-access-z9rcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.786966 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb92791b-a628-41ac-ad29-7fb59c0c35f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bb92791b-a628-41ac-ad29-7fb59c0c35f8" (UID: "bb92791b-a628-41ac-ad29-7fb59c0c35f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.878771 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.878815 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb92791b-a628-41ac-ad29-7fb59c0c35f8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.878829 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb92791b-a628-41ac-ad29-7fb59c0c35f8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:00 crc kubenswrapper[4911]: I0310 14:09:00.878840 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9rcv\" (UniqueName: \"kubernetes.io/projected/bb92791b-a628-41ac-ad29-7fb59c0c35f8-kube-api-access-z9rcv\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.285055 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.285154 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.286481 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.288866 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.313299 4911 generic.go:334] "Generic (PLEG): container finished" podID="bb92791b-a628-41ac-ad29-7fb59c0c35f8" containerID="2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a" exitCode=0 Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.313352 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69" event={"ID":"bb92791b-a628-41ac-ad29-7fb59c0c35f8","Type":"ContainerDied","Data":"2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a"} Mar 10 14:09:01 crc 
kubenswrapper[4911]: I0310 14:09:01.313396 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69" event={"ID":"bb92791b-a628-41ac-ad29-7fb59c0c35f8","Type":"ContainerDied","Data":"11a549637352921de486d556e6f3864068994b78b9c81392a345bcb8768b6651"} Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.313379 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.313419 4911 scope.go:117] "RemoveContainer" containerID="2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.340758 4911 scope.go:117] "RemoveContainer" containerID="2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a" Mar 10 14:09:01 crc kubenswrapper[4911]: E0310 14:09:01.341538 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a\": container with ID starting with 2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a not found: ID does not exist" containerID="2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.341601 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a"} err="failed to get container status \"2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a\": rpc error: code = NotFound desc = could not find container \"2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a\": container with ID starting with 2d5e8401adff1c32d86ec3e9c688b930bb2f77260c3647f7848569a816dbeb7a not found: ID does not exist" Mar 10 14:09:01 crc 
kubenswrapper[4911]: I0310 14:09:01.364976 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"] Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.366165 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-rwv69"] Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.489349 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.493418 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.493806 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a44efc-20ad-4c01-9606-e6fdb5e0c721-metrics-certs\") pod \"network-metrics-daemon-r28f8\" (UID: \"d7a44efc-20ad-4c01-9606-e6fdb5e0c721\") " pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.698956 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.704929 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r28f8" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.859355 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49"] Mar 10 14:09:01 crc kubenswrapper[4911]: E0310 14:09:01.860110 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb92791b-a628-41ac-ad29-7fb59c0c35f8" containerName="route-controller-manager" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.860124 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb92791b-a628-41ac-ad29-7fb59c0c35f8" containerName="route-controller-manager" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.860225 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb92791b-a628-41ac-ad29-7fb59c0c35f8" containerName="route-controller-manager" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.860715 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.864116 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.864619 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.865974 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.866373 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.866715 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.868531 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.875286 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49"] Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.894711 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zp8\" (UniqueName: \"kubernetes.io/projected/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-kube-api-access-d4zp8\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.894863 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-client-ca\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.894925 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-serving-cert\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.894986 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-config\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:01 crc kubenswrapper[4911]: W0310 14:09:01.989400 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-646301b426d4002850554b98414d8bc7082623b99ddc13fa405a8f18b31bd1fd WatchSource:0}: Error finding container 646301b426d4002850554b98414d8bc7082623b99ddc13fa405a8f18b31bd1fd: Status 404 returned error can't find the container with id 646301b426d4002850554b98414d8bc7082623b99ddc13fa405a8f18b31bd1fd Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.996306 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-client-ca\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.996383 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-serving-cert\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.996436 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-config\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.996477 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4zp8\" (UniqueName: \"kubernetes.io/projected/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-kube-api-access-d4zp8\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.997446 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-client-ca\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 
14:09:01 crc kubenswrapper[4911]: I0310 14:09:01.997763 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-config\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.000698 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-serving-cert\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.019972 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4zp8\" (UniqueName: \"kubernetes.io/projected/4c1d4e0f-56ef-49e5-a88c-6895cf7164b9-kube-api-access-d4zp8\") pod \"route-controller-manager-8448895d96-5ng49\" (UID: \"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9\") " pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.160830 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r28f8"] Mar 10 14:09:02 crc kubenswrapper[4911]: W0310 14:09:02.165108 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a44efc_20ad_4c01_9606_e6fdb5e0c721.slice/crio-dd61bc8409d7ba41beba15f2e40df467a2a7c3961f2019b6d22c0fa69c6fe6e3 WatchSource:0}: Error finding container dd61bc8409d7ba41beba15f2e40df467a2a7c3961f2019b6d22c0fa69c6fe6e3: Status 404 returned error can't find the container with id dd61bc8409d7ba41beba15f2e40df467a2a7c3961f2019b6d22c0fa69c6fe6e3 Mar 10 
14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.187305 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.199678 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb92791b-a628-41ac-ad29-7fb59c0c35f8" path="/var/lib/kubelet/pods/bb92791b-a628-41ac-ad29-7fb59c0c35f8/volumes" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.300717 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.301199 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.310114 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.310447 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.332655 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r28f8" event={"ID":"d7a44efc-20ad-4c01-9606-e6fdb5e0c721","Type":"ContainerStarted","Data":"dd61bc8409d7ba41beba15f2e40df467a2a7c3961f2019b6d22c0fa69c6fe6e3"} Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.333796 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1e260fdb59a080548936d5ae9d1f726b67dffc529de379863d98fdce6f7eca1d"} Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.333820 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"646301b426d4002850554b98414d8bc7082623b99ddc13fa405a8f18b31bd1fd"} Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.497228 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.497343 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 14:09:02 crc kubenswrapper[4911]: I0310 14:09:02.645094 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49"] Mar 10 14:09:03 crc kubenswrapper[4911]: W0310 14:09:02.881919 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-45f9eb1f44a48c69611887647739109921b3122eaf86e7fef888077675e7b6e0 WatchSource:0}: Error finding container 45f9eb1f44a48c69611887647739109921b3122eaf86e7fef888077675e7b6e0: Status 404 returned error can't find the container with id 45f9eb1f44a48c69611887647739109921b3122eaf86e7fef888077675e7b6e0 Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.351042 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r28f8" event={"ID":"d7a44efc-20ad-4c01-9606-e6fdb5e0c721","Type":"ContainerStarted","Data":"8bc8d689e1f513c5c48505db22cdb3ac52ac7a329bfbd5370afb814c2e8a7951"} Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.351451 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r28f8" event={"ID":"d7a44efc-20ad-4c01-9606-e6fdb5e0c721","Type":"ContainerStarted","Data":"1539bbbb587c0100525d65935343863c4c26278a2fc0cf5b1e09972ca2cb132d"} Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.354377 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1384f6f698258643a7d4dc92de53866fdc26611a5e3d442a0f346620b360468c"} Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.354396 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"45f9eb1f44a48c69611887647739109921b3122eaf86e7fef888077675e7b6e0"} Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.356713 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"412b94ed6a54a6d27855b0dc5829f2b5dde93717bbeb366ccb9e6ef048cd55a6"} Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.358363 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" event={"ID":"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9","Type":"ContainerStarted","Data":"9dd7900ae357a9a6aa1935f7d0954806a36cbb79ff75012e28ef40e24e0bb031"} Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.358387 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" event={"ID":"4c1d4e0f-56ef-49e5-a88c-6895cf7164b9","Type":"ContainerStarted","Data":"efd7cff8e3d89ac3048854410ba55dcf983dbbe1f2a567d47750385f0b5b38e3"} Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.358663 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.371539 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.375236 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-r28f8" podStartSLOduration=412.375219282 podStartE2EDuration="6m52.375219282s" podCreationTimestamp="2026-03-10 
14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:09:03.369008485 +0000 UTC m=+447.932528392" watchObservedRunningTime="2026-03-10 14:09:03.375219282 +0000 UTC m=+447.938739199" Mar 10 14:09:03 crc kubenswrapper[4911]: I0310 14:09:03.417883 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8448895d96-5ng49" podStartSLOduration=3.417859955 podStartE2EDuration="3.417859955s" podCreationTimestamp="2026-03-10 14:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:09:03.414672837 +0000 UTC m=+447.978192754" watchObservedRunningTime="2026-03-10 14:09:03.417859955 +0000 UTC m=+447.981379872" Mar 10 14:09:05 crc kubenswrapper[4911]: I0310 14:09:05.374616 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0dcb8333981f5ca60c57ce5272b8d76f27e39df4092441a4934285476991cd05"} Mar 10 14:09:05 crc kubenswrapper[4911]: I0310 14:09:05.375552 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:09:14 crc kubenswrapper[4911]: I0310 14:09:14.765156 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" podUID="f89e6f0d-a78a-4543-9b03-ad1245748d9a" containerName="registry" containerID="cri-o://231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a" gracePeriod=30 Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.145137 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.218900 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-bound-sa-token\") pod \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.218986 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-trusted-ca\") pod \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.219035 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-tls\") pod \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.219083 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-certificates\") pod \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.219133 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f89e6f0d-a78a-4543-9b03-ad1245748d9a-installation-pull-secrets\") pod \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.219222 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f89e6f0d-a78a-4543-9b03-ad1245748d9a-ca-trust-extracted\") pod \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.219271 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pj9t\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-kube-api-access-7pj9t\") pod \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.219640 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\" (UID: \"f89e6f0d-a78a-4543-9b03-ad1245748d9a\") " Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.220449 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f89e6f0d-a78a-4543-9b03-ad1245748d9a" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.220545 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f89e6f0d-a78a-4543-9b03-ad1245748d9a" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.228102 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f89e6f0d-a78a-4543-9b03-ad1245748d9a" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.228925 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89e6f0d-a78a-4543-9b03-ad1245748d9a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f89e6f0d-a78a-4543-9b03-ad1245748d9a" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.230133 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-kube-api-access-7pj9t" (OuterVolumeSpecName: "kube-api-access-7pj9t") pod "f89e6f0d-a78a-4543-9b03-ad1245748d9a" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a"). InnerVolumeSpecName "kube-api-access-7pj9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.230989 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f89e6f0d-a78a-4543-9b03-ad1245748d9a" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.241090 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f89e6f0d-a78a-4543-9b03-ad1245748d9a" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.258072 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f89e6f0d-a78a-4543-9b03-ad1245748d9a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f89e6f0d-a78a-4543-9b03-ad1245748d9a" (UID: "f89e6f0d-a78a-4543-9b03-ad1245748d9a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.321323 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.321366 4911 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.321381 4911 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f89e6f0d-a78a-4543-9b03-ad1245748d9a-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.321415 4911 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/f89e6f0d-a78a-4543-9b03-ad1245748d9a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.321427 4911 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f89e6f0d-a78a-4543-9b03-ad1245748d9a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.321441 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pj9t\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-kube-api-access-7pj9t\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.321457 4911 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89e6f0d-a78a-4543-9b03-ad1245748d9a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.440713 4911 generic.go:334] "Generic (PLEG): container finished" podID="f89e6f0d-a78a-4543-9b03-ad1245748d9a" containerID="231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a" exitCode=0 Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.440810 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" event={"ID":"f89e6f0d-a78a-4543-9b03-ad1245748d9a","Type":"ContainerDied","Data":"231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a"} Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.441077 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" event={"ID":"f89e6f0d-a78a-4543-9b03-ad1245748d9a","Type":"ContainerDied","Data":"b1173d6c86d458decd5ed2b4e4160f6cabc5ee7814ad80dd6a02787e9970ea79"} Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.440850 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m5pk6" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.441102 4911 scope.go:117] "RemoveContainer" containerID="231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.472182 4911 scope.go:117] "RemoveContainer" containerID="231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a" Mar 10 14:09:15 crc kubenswrapper[4911]: E0310 14:09:15.475494 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a\": container with ID starting with 231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a not found: ID does not exist" containerID="231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.475567 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a"} err="failed to get container status \"231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a\": rpc error: code = NotFound desc = could not find container \"231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a\": container with ID starting with 231c3aa1a245c0648a9e407b39a5ba3e1e2621ef2b486937cb6db13915fcdd9a not found: ID does not exist" Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.494262 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m5pk6"] Mar 10 14:09:15 crc kubenswrapper[4911]: I0310 14:09:15.501580 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m5pk6"] Mar 10 14:09:16 crc kubenswrapper[4911]: I0310 14:09:16.213309 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f89e6f0d-a78a-4543-9b03-ad1245748d9a" path="/var/lib/kubelet/pods/f89e6f0d-a78a-4543-9b03-ad1245748d9a/volumes" Mar 10 14:09:18 crc kubenswrapper[4911]: I0310 14:09:18.520471 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:09:18 crc kubenswrapper[4911]: I0310 14:09:18.521124 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:09:42 crc kubenswrapper[4911]: I0310 14:09:42.502519 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 14:09:48 crc kubenswrapper[4911]: I0310 14:09:48.520500 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:09:48 crc kubenswrapper[4911]: I0310 14:09:48.520884 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:09:48 crc kubenswrapper[4911]: I0310 14:09:48.520951 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:09:48 crc kubenswrapper[4911]: I0310 14:09:48.521766 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0bb21d0d9028cc517f2b21aadcd53128eee7aee107df38fbfad5959639c7f688"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 14:09:48 crc kubenswrapper[4911]: I0310 14:09:48.521850 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://0bb21d0d9028cc517f2b21aadcd53128eee7aee107df38fbfad5959639c7f688" gracePeriod=600 Mar 10 14:09:48 crc kubenswrapper[4911]: I0310 14:09:48.879777 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="0bb21d0d9028cc517f2b21aadcd53128eee7aee107df38fbfad5959639c7f688" exitCode=0 Mar 10 14:09:48 crc kubenswrapper[4911]: I0310 14:09:48.879882 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"0bb21d0d9028cc517f2b21aadcd53128eee7aee107df38fbfad5959639c7f688"} Mar 10 14:09:48 crc kubenswrapper[4911]: I0310 14:09:48.880123 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"87a8887e64f64f70cc9bb63b42cf72e1e862391a02a6945fc6c8c609082e2a34"} Mar 10 14:09:48 crc kubenswrapper[4911]: I0310 14:09:48.880337 4911 scope.go:117] "RemoveContainer" 
containerID="7677dc6d0537adf2201f7090ab7cabd69f10e337b220689c09aeddc696338950" Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.147025 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552530-kshjj"] Mar 10 14:10:00 crc kubenswrapper[4911]: E0310 14:10:00.147807 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89e6f0d-a78a-4543-9b03-ad1245748d9a" containerName="registry" Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.147822 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89e6f0d-a78a-4543-9b03-ad1245748d9a" containerName="registry" Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.147966 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89e6f0d-a78a-4543-9b03-ad1245748d9a" containerName="registry" Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.148530 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552530-kshjj" Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.152640 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.152906 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.153076 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.159697 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552530-kshjj"] Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.326078 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cfmm\" (UniqueName: 
\"kubernetes.io/projected/d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71-kube-api-access-8cfmm\") pod \"auto-csr-approver-29552530-kshjj\" (UID: \"d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71\") " pod="openshift-infra/auto-csr-approver-29552530-kshjj"
Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.428443 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cfmm\" (UniqueName: \"kubernetes.io/projected/d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71-kube-api-access-8cfmm\") pod \"auto-csr-approver-29552530-kshjj\" (UID: \"d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71\") " pod="openshift-infra/auto-csr-approver-29552530-kshjj"
Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.455971 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cfmm\" (UniqueName: \"kubernetes.io/projected/d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71-kube-api-access-8cfmm\") pod \"auto-csr-approver-29552530-kshjj\" (UID: \"d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71\") " pod="openshift-infra/auto-csr-approver-29552530-kshjj"
Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.474185 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552530-kshjj"
Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.677330 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552530-kshjj"]
Mar 10 14:10:00 crc kubenswrapper[4911]: I0310 14:10:00.974772 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552530-kshjj" event={"ID":"d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71","Type":"ContainerStarted","Data":"83ed9432154e9ef4f4ace3f20977d32c6d6964a66811da07797e2525847c49e3"}
Mar 10 14:10:02 crc kubenswrapper[4911]: I0310 14:10:02.991455 4911 generic.go:334] "Generic (PLEG): container finished" podID="d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71" containerID="33850a67294f29efe2211f3dd2ca090711bb1173a67d98d810cbf078e1695130" exitCode=0
Mar 10 14:10:02 crc kubenswrapper[4911]: I0310 14:10:02.991586 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552530-kshjj" event={"ID":"d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71","Type":"ContainerDied","Data":"33850a67294f29efe2211f3dd2ca090711bb1173a67d98d810cbf078e1695130"}
Mar 10 14:10:04 crc kubenswrapper[4911]: I0310 14:10:04.355679 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552530-kshjj"
Mar 10 14:10:04 crc kubenswrapper[4911]: I0310 14:10:04.488044 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cfmm\" (UniqueName: \"kubernetes.io/projected/d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71-kube-api-access-8cfmm\") pod \"d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71\" (UID: \"d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71\") "
Mar 10 14:10:04 crc kubenswrapper[4911]: I0310 14:10:04.495203 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71-kube-api-access-8cfmm" (OuterVolumeSpecName: "kube-api-access-8cfmm") pod "d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71" (UID: "d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71"). InnerVolumeSpecName "kube-api-access-8cfmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:10:04 crc kubenswrapper[4911]: I0310 14:10:04.590054 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cfmm\" (UniqueName: \"kubernetes.io/projected/d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71-kube-api-access-8cfmm\") on node \"crc\" DevicePath \"\""
Mar 10 14:10:05 crc kubenswrapper[4911]: I0310 14:10:05.012943 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552530-kshjj" event={"ID":"d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71","Type":"ContainerDied","Data":"83ed9432154e9ef4f4ace3f20977d32c6d6964a66811da07797e2525847c49e3"}
Mar 10 14:10:05 crc kubenswrapper[4911]: I0310 14:10:05.012998 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83ed9432154e9ef4f4ace3f20977d32c6d6964a66811da07797e2525847c49e3"
Mar 10 14:10:05 crc kubenswrapper[4911]: I0310 14:10:05.013041 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552530-kshjj"
Mar 10 14:10:05 crc kubenswrapper[4911]: I0310 14:10:05.471627 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552524-98vcv"]
Mar 10 14:10:05 crc kubenswrapper[4911]: I0310 14:10:05.475640 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552524-98vcv"]
Mar 10 14:10:06 crc kubenswrapper[4911]: I0310 14:10:06.207992 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30365124-15de-458c-b8f8-97b0fab41da4" path="/var/lib/kubelet/pods/30365124-15de-458c-b8f8-97b0fab41da4/volumes"
Mar 10 14:11:36 crc kubenswrapper[4911]: I0310 14:11:36.606862 4911 scope.go:117] "RemoveContainer" containerID="32231f2264d17f7fca1c7041ef8ceac92288228832e5591cef79f082073ca2ed"
Mar 10 14:11:48 crc kubenswrapper[4911]: I0310 14:11:48.520956 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:11:48 crc kubenswrapper[4911]: I0310 14:11:48.521606 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.135399 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552532-ln5mn"]
Mar 10 14:12:00 crc kubenswrapper[4911]: E0310 14:12:00.136344 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71" containerName="oc"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.136359 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71" containerName="oc"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.136494 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71" containerName="oc"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.136998 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552532-ln5mn"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.139445 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.139943 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.140907 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.146790 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552532-ln5mn"]
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.277318 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm8v2\" (UniqueName: \"kubernetes.io/projected/ac79edf7-7478-4f46-b74e-a3db1d75d52f-kube-api-access-wm8v2\") pod \"auto-csr-approver-29552532-ln5mn\" (UID: \"ac79edf7-7478-4f46-b74e-a3db1d75d52f\") " pod="openshift-infra/auto-csr-approver-29552532-ln5mn"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.378697 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm8v2\" (UniqueName: \"kubernetes.io/projected/ac79edf7-7478-4f46-b74e-a3db1d75d52f-kube-api-access-wm8v2\") pod \"auto-csr-approver-29552532-ln5mn\" (UID: \"ac79edf7-7478-4f46-b74e-a3db1d75d52f\") " pod="openshift-infra/auto-csr-approver-29552532-ln5mn"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.399268 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm8v2\" (UniqueName: \"kubernetes.io/projected/ac79edf7-7478-4f46-b74e-a3db1d75d52f-kube-api-access-wm8v2\") pod \"auto-csr-approver-29552532-ln5mn\" (UID: \"ac79edf7-7478-4f46-b74e-a3db1d75d52f\") " pod="openshift-infra/auto-csr-approver-29552532-ln5mn"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.454717 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552532-ln5mn"
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.635089 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552532-ln5mn"]
Mar 10 14:12:00 crc kubenswrapper[4911]: I0310 14:12:00.646445 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 14:12:01 crc kubenswrapper[4911]: I0310 14:12:01.090215 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552532-ln5mn" event={"ID":"ac79edf7-7478-4f46-b74e-a3db1d75d52f","Type":"ContainerStarted","Data":"26de7d3ddc8b1d11e5cdee408c13020907d921e572270c2c225b6c43fe6c9300"}
Mar 10 14:12:02 crc kubenswrapper[4911]: I0310 14:12:02.097302 4911 generic.go:334] "Generic (PLEG): container finished" podID="ac79edf7-7478-4f46-b74e-a3db1d75d52f" containerID="1612c726eac636d45591ce038742c082ec5a547f4043d8170fb59a22ede3e44c" exitCode=0
Mar 10 14:12:02 crc kubenswrapper[4911]: I0310 14:12:02.097346 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552532-ln5mn" event={"ID":"ac79edf7-7478-4f46-b74e-a3db1d75d52f","Type":"ContainerDied","Data":"1612c726eac636d45591ce038742c082ec5a547f4043d8170fb59a22ede3e44c"}
Mar 10 14:12:03 crc kubenswrapper[4911]: I0310 14:12:03.317624 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552532-ln5mn"
Mar 10 14:12:03 crc kubenswrapper[4911]: I0310 14:12:03.519446 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm8v2\" (UniqueName: \"kubernetes.io/projected/ac79edf7-7478-4f46-b74e-a3db1d75d52f-kube-api-access-wm8v2\") pod \"ac79edf7-7478-4f46-b74e-a3db1d75d52f\" (UID: \"ac79edf7-7478-4f46-b74e-a3db1d75d52f\") "
Mar 10 14:12:03 crc kubenswrapper[4911]: I0310 14:12:03.525324 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac79edf7-7478-4f46-b74e-a3db1d75d52f-kube-api-access-wm8v2" (OuterVolumeSpecName: "kube-api-access-wm8v2") pod "ac79edf7-7478-4f46-b74e-a3db1d75d52f" (UID: "ac79edf7-7478-4f46-b74e-a3db1d75d52f"). InnerVolumeSpecName "kube-api-access-wm8v2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:12:03 crc kubenswrapper[4911]: I0310 14:12:03.620674 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm8v2\" (UniqueName: \"kubernetes.io/projected/ac79edf7-7478-4f46-b74e-a3db1d75d52f-kube-api-access-wm8v2\") on node \"crc\" DevicePath \"\""
Mar 10 14:12:04 crc kubenswrapper[4911]: I0310 14:12:04.110541 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552532-ln5mn" event={"ID":"ac79edf7-7478-4f46-b74e-a3db1d75d52f","Type":"ContainerDied","Data":"26de7d3ddc8b1d11e5cdee408c13020907d921e572270c2c225b6c43fe6c9300"}
Mar 10 14:12:04 crc kubenswrapper[4911]: I0310 14:12:04.110595 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26de7d3ddc8b1d11e5cdee408c13020907d921e572270c2c225b6c43fe6c9300"
Mar 10 14:12:04 crc kubenswrapper[4911]: I0310 14:12:04.110620 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552532-ln5mn"
Mar 10 14:12:04 crc kubenswrapper[4911]: I0310 14:12:04.378778 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552526-hfmpq"]
Mar 10 14:12:04 crc kubenswrapper[4911]: I0310 14:12:04.382014 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552526-hfmpq"]
Mar 10 14:12:06 crc kubenswrapper[4911]: I0310 14:12:06.200879 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d90ef4fc-8e97-468a-b0ba-d7105067b50c" path="/var/lib/kubelet/pods/d90ef4fc-8e97-468a-b0ba-d7105067b50c/volumes"
Mar 10 14:12:18 crc kubenswrapper[4911]: I0310 14:12:18.521518 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:12:18 crc kubenswrapper[4911]: I0310 14:12:18.522342 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:12:36 crc kubenswrapper[4911]: I0310 14:12:36.666353 4911 scope.go:117] "RemoveContainer" containerID="bf3efacd370b2cff9d688064dd0261963b117775fbf8ca51b9b199b509f410e4"
Mar 10 14:12:48 crc kubenswrapper[4911]: I0310 14:12:48.521530 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:12:48 crc kubenswrapper[4911]: I0310 14:12:48.522304 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:12:48 crc kubenswrapper[4911]: I0310 14:12:48.522381 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx"
Mar 10 14:12:48 crc kubenswrapper[4911]: I0310 14:12:48.523449 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87a8887e64f64f70cc9bb63b42cf72e1e862391a02a6945fc6c8c609082e2a34"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 14:12:48 crc kubenswrapper[4911]: I0310 14:12:48.523551 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://87a8887e64f64f70cc9bb63b42cf72e1e862391a02a6945fc6c8c609082e2a34" gracePeriod=600
Mar 10 14:12:49 crc kubenswrapper[4911]: I0310 14:12:49.433460 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="87a8887e64f64f70cc9bb63b42cf72e1e862391a02a6945fc6c8c609082e2a34" exitCode=0
Mar 10 14:12:49 crc kubenswrapper[4911]: I0310 14:12:49.433570 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"87a8887e64f64f70cc9bb63b42cf72e1e862391a02a6945fc6c8c609082e2a34"}
Mar 10 14:12:49 crc kubenswrapper[4911]: I0310 14:12:49.433926 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"857ad61597498c0292e86491603a433b330dd14022c00daae311c70410368529"}
Mar 10 14:12:49 crc kubenswrapper[4911]: I0310 14:12:49.433968 4911 scope.go:117] "RemoveContainer" containerID="0bb21d0d9028cc517f2b21aadcd53128eee7aee107df38fbfad5959639c7f688"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.141628 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552534-8wvx4"]
Mar 10 14:14:00 crc kubenswrapper[4911]: E0310 14:14:00.142775 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac79edf7-7478-4f46-b74e-a3db1d75d52f" containerName="oc"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.142792 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac79edf7-7478-4f46-b74e-a3db1d75d52f" containerName="oc"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.142923 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac79edf7-7478-4f46-b74e-a3db1d75d52f" containerName="oc"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.143486 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552534-8wvx4"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.146400 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.146865 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.147254 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552534-8wvx4"]
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.148082 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.303137 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8w5\" (UniqueName: \"kubernetes.io/projected/ef59c911-45bc-4848-8c29-f63b38053e1d-kube-api-access-tw8w5\") pod \"auto-csr-approver-29552534-8wvx4\" (UID: \"ef59c911-45bc-4848-8c29-f63b38053e1d\") " pod="openshift-infra/auto-csr-approver-29552534-8wvx4"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.404645 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw8w5\" (UniqueName: \"kubernetes.io/projected/ef59c911-45bc-4848-8c29-f63b38053e1d-kube-api-access-tw8w5\") pod \"auto-csr-approver-29552534-8wvx4\" (UID: \"ef59c911-45bc-4848-8c29-f63b38053e1d\") " pod="openshift-infra/auto-csr-approver-29552534-8wvx4"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.424467 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw8w5\" (UniqueName: \"kubernetes.io/projected/ef59c911-45bc-4848-8c29-f63b38053e1d-kube-api-access-tw8w5\") pod \"auto-csr-approver-29552534-8wvx4\" (UID: \"ef59c911-45bc-4848-8c29-f63b38053e1d\") " pod="openshift-infra/auto-csr-approver-29552534-8wvx4"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.460057 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552534-8wvx4"
Mar 10 14:14:00 crc kubenswrapper[4911]: I0310 14:14:00.647244 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552534-8wvx4"]
Mar 10 14:14:01 crc kubenswrapper[4911]: I0310 14:14:01.450538 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552534-8wvx4" event={"ID":"ef59c911-45bc-4848-8c29-f63b38053e1d","Type":"ContainerStarted","Data":"dd672936d7e13f9cabf6082c1dfe268fd113914c70856865a7e756da1e8351c5"}
Mar 10 14:14:02 crc kubenswrapper[4911]: I0310 14:14:02.463997 4911 generic.go:334] "Generic (PLEG): container finished" podID="ef59c911-45bc-4848-8c29-f63b38053e1d" containerID="731bdd285ad9b972671a61b855174efa743f141ab1c46d2e9c0f79117ad2355a" exitCode=0
Mar 10 14:14:02 crc kubenswrapper[4911]: I0310 14:14:02.464125 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552534-8wvx4" event={"ID":"ef59c911-45bc-4848-8c29-f63b38053e1d","Type":"ContainerDied","Data":"731bdd285ad9b972671a61b855174efa743f141ab1c46d2e9c0f79117ad2355a"}
Mar 10 14:14:03 crc kubenswrapper[4911]: I0310 14:14:03.688908 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552534-8wvx4"
Mar 10 14:14:03 crc kubenswrapper[4911]: I0310 14:14:03.857424 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw8w5\" (UniqueName: \"kubernetes.io/projected/ef59c911-45bc-4848-8c29-f63b38053e1d-kube-api-access-tw8w5\") pod \"ef59c911-45bc-4848-8c29-f63b38053e1d\" (UID: \"ef59c911-45bc-4848-8c29-f63b38053e1d\") "
Mar 10 14:14:03 crc kubenswrapper[4911]: I0310 14:14:03.864008 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef59c911-45bc-4848-8c29-f63b38053e1d-kube-api-access-tw8w5" (OuterVolumeSpecName: "kube-api-access-tw8w5") pod "ef59c911-45bc-4848-8c29-f63b38053e1d" (UID: "ef59c911-45bc-4848-8c29-f63b38053e1d"). InnerVolumeSpecName "kube-api-access-tw8w5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:14:03 crc kubenswrapper[4911]: I0310 14:14:03.959773 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw8w5\" (UniqueName: \"kubernetes.io/projected/ef59c911-45bc-4848-8c29-f63b38053e1d-kube-api-access-tw8w5\") on node \"crc\" DevicePath \"\""
Mar 10 14:14:04 crc kubenswrapper[4911]: I0310 14:14:04.476258 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552534-8wvx4" event={"ID":"ef59c911-45bc-4848-8c29-f63b38053e1d","Type":"ContainerDied","Data":"dd672936d7e13f9cabf6082c1dfe268fd113914c70856865a7e756da1e8351c5"}
Mar 10 14:14:04 crc kubenswrapper[4911]: I0310 14:14:04.476317 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd672936d7e13f9cabf6082c1dfe268fd113914c70856865a7e756da1e8351c5"
Mar 10 14:14:04 crc kubenswrapper[4911]: I0310 14:14:04.476341 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552534-8wvx4"
Mar 10 14:14:04 crc kubenswrapper[4911]: I0310 14:14:04.748437 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552528-m6klh"]
Mar 10 14:14:04 crc kubenswrapper[4911]: I0310 14:14:04.751536 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552528-m6klh"]
Mar 10 14:14:06 crc kubenswrapper[4911]: I0310 14:14:06.201325 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbc69578-9347-4984-af7d-e05aa9abd29d" path="/var/lib/kubelet/pods/bbc69578-9347-4984-af7d-e05aa9abd29d/volumes"
Mar 10 14:14:36 crc kubenswrapper[4911]: I0310 14:14:36.773316 4911 scope.go:117] "RemoveContainer" containerID="f8f6544ff8e5585502ae5f1078798e5b9927b2ad0e12ba5072382cf891245b7e"
Mar 10 14:14:48 crc kubenswrapper[4911]: I0310 14:14:48.521331 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:14:48 crc kubenswrapper[4911]: I0310 14:14:48.522838 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.149437 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"]
Mar 10 14:15:00 crc kubenswrapper[4911]: E0310 14:15:00.150197 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef59c911-45bc-4848-8c29-f63b38053e1d" containerName="oc"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.150218 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef59c911-45bc-4848-8c29-f63b38053e1d" containerName="oc"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.150380 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef59c911-45bc-4848-8c29-f63b38053e1d" containerName="oc"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.151451 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.155779 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.156305 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.164682 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"]
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.250340 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f29a4b10-f12c-47df-9061-c208883696aa-secret-volume\") pod \"collect-profiles-29552535-69fkb\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.250563 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f29a4b10-f12c-47df-9061-c208883696aa-config-volume\") pod \"collect-profiles-29552535-69fkb\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.251312 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5ptl\" (UniqueName: \"kubernetes.io/projected/f29a4b10-f12c-47df-9061-c208883696aa-kube-api-access-d5ptl\") pod \"collect-profiles-29552535-69fkb\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.351811 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f29a4b10-f12c-47df-9061-c208883696aa-secret-volume\") pod \"collect-profiles-29552535-69fkb\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.351889 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f29a4b10-f12c-47df-9061-c208883696aa-config-volume\") pod \"collect-profiles-29552535-69fkb\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.351917 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5ptl\" (UniqueName: \"kubernetes.io/projected/f29a4b10-f12c-47df-9061-c208883696aa-kube-api-access-d5ptl\") pod \"collect-profiles-29552535-69fkb\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.353592 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f29a4b10-f12c-47df-9061-c208883696aa-config-volume\") pod \"collect-profiles-29552535-69fkb\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.363878 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f29a4b10-f12c-47df-9061-c208883696aa-secret-volume\") pod \"collect-profiles-29552535-69fkb\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.371844 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5ptl\" (UniqueName: \"kubernetes.io/projected/f29a4b10-f12c-47df-9061-c208883696aa-kube-api-access-d5ptl\") pod \"collect-profiles-29552535-69fkb\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.469832 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.667256 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"]
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.849710 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb" event={"ID":"f29a4b10-f12c-47df-9061-c208883696aa","Type":"ContainerStarted","Data":"a8a3061e1682af79f2d778689fa9ff66eb1275dcb7caab2f3649ac3aef929731"}
Mar 10 14:15:00 crc kubenswrapper[4911]: I0310 14:15:00.849784 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb" event={"ID":"f29a4b10-f12c-47df-9061-c208883696aa","Type":"ContainerStarted","Data":"96558ac2ca1ef3fe59dc8f1a12138185e69fb5d6ddc982348ddd3f6218d8e7ae"}
Mar 10 14:15:01 crc kubenswrapper[4911]: I0310 14:15:01.867704 4911 generic.go:334] "Generic (PLEG): container finished" podID="f29a4b10-f12c-47df-9061-c208883696aa" containerID="a8a3061e1682af79f2d778689fa9ff66eb1275dcb7caab2f3649ac3aef929731" exitCode=0
Mar 10 14:15:01 crc kubenswrapper[4911]: I0310 14:15:01.867810 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb" event={"ID":"f29a4b10-f12c-47df-9061-c208883696aa","Type":"ContainerDied","Data":"a8a3061e1682af79f2d778689fa9ff66eb1275dcb7caab2f3649ac3aef929731"}
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.068574 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.191435 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f29a4b10-f12c-47df-9061-c208883696aa-secret-volume\") pod \"f29a4b10-f12c-47df-9061-c208883696aa\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") "
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.192019 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5ptl\" (UniqueName: \"kubernetes.io/projected/f29a4b10-f12c-47df-9061-c208883696aa-kube-api-access-d5ptl\") pod \"f29a4b10-f12c-47df-9061-c208883696aa\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") "
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.192076 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f29a4b10-f12c-47df-9061-c208883696aa-config-volume\") pod \"f29a4b10-f12c-47df-9061-c208883696aa\" (UID: \"f29a4b10-f12c-47df-9061-c208883696aa\") "
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.192774 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f29a4b10-f12c-47df-9061-c208883696aa-config-volume" (OuterVolumeSpecName: "config-volume") pod "f29a4b10-f12c-47df-9061-c208883696aa" (UID: "f29a4b10-f12c-47df-9061-c208883696aa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.196778 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29a4b10-f12c-47df-9061-c208883696aa-kube-api-access-d5ptl" (OuterVolumeSpecName: "kube-api-access-d5ptl") pod "f29a4b10-f12c-47df-9061-c208883696aa" (UID: "f29a4b10-f12c-47df-9061-c208883696aa"). InnerVolumeSpecName "kube-api-access-d5ptl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.196800 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29a4b10-f12c-47df-9061-c208883696aa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f29a4b10-f12c-47df-9061-c208883696aa" (UID: "f29a4b10-f12c-47df-9061-c208883696aa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.293317 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f29a4b10-f12c-47df-9061-c208883696aa-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.294385 4911 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f29a4b10-f12c-47df-9061-c208883696aa-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.294832 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5ptl\" (UniqueName: \"kubernetes.io/projected/f29a4b10-f12c-47df-9061-c208883696aa-kube-api-access-d5ptl\") on node \"crc\" DevicePath \"\""
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.882657 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb" event={"ID":"f29a4b10-f12c-47df-9061-c208883696aa","Type":"ContainerDied","Data":"96558ac2ca1ef3fe59dc8f1a12138185e69fb5d6ddc982348ddd3f6218d8e7ae"}
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.882706 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96558ac2ca1ef3fe59dc8f1a12138185e69fb5d6ddc982348ddd3f6218d8e7ae"
Mar 10 14:15:03 crc kubenswrapper[4911]: I0310 14:15:03.882717 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"
Mar 10 14:15:18 crc kubenswrapper[4911]: I0310 14:15:18.520425 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:15:18 crc kubenswrapper[4911]: I0310 14:15:18.521038 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:15:23 crc kubenswrapper[4911]: I0310 14:15:23.003453 4911 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 14:15:48 crc kubenswrapper[4911]: I0310 14:15:48.521370 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:15:48 crc kubenswrapper[4911]: I0310 14:15:48.522398 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:15:48 crc kubenswrapper[4911]: I0310 14:15:48.522496 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx"
Mar 10 14:15:48 crc kubenswrapper[4911]: I0310 14:15:48.523667 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"857ad61597498c0292e86491603a433b330dd14022c00daae311c70410368529"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 14:15:48 crc kubenswrapper[4911]: I0310 14:15:48.523830 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://857ad61597498c0292e86491603a433b330dd14022c00daae311c70410368529" gracePeriod=600
Mar 10 14:15:49 crc kubenswrapper[4911]: I0310 14:15:49.227910 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="857ad61597498c0292e86491603a433b330dd14022c00daae311c70410368529" exitCode=0
Mar 10 14:15:49 crc kubenswrapper[4911]: I0310 14:15:49.228000 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"857ad61597498c0292e86491603a433b330dd14022c00daae311c70410368529"}
Mar 10 14:15:49 crc kubenswrapper[4911]: I0310 14:15:49.228507 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"064f54de59fb1087deb1f06362fea8b7318f6c645504d0d54010b3ae33528b2f"}
Mar 10 14:15:49 crc kubenswrapper[4911]: I0310 14:15:49.228548 4911 scope.go:117] "RemoveContainer"
containerID="87a8887e64f64f70cc9bb63b42cf72e1e862391a02a6945fc6c8c609082e2a34" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.509273 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t"] Mar 10 14:15:50 crc kubenswrapper[4911]: E0310 14:15:50.510113 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29a4b10-f12c-47df-9061-c208883696aa" containerName="collect-profiles" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.510127 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29a4b10-f12c-47df-9061-c208883696aa" containerName="collect-profiles" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.510216 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29a4b10-f12c-47df-9061-c208883696aa" containerName="collect-profiles" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.510693 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.513698 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.513746 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.517544 4911 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8nr27" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.519998 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-t952c"] Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.520868 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-t952c" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.524934 4911 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-72g5k" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.532289 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t"] Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.536318 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-t952c"] Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.556289 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2fm2x"] Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.564231 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-2fm2x" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.567011 4911 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-62px7" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.574777 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2fm2x"] Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.646398 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9fmx\" (UniqueName: \"kubernetes.io/projected/90feb50a-5cbc-4a77-b328-65b1f3adefc0-kube-api-access-v9fmx\") pod \"cert-manager-webhook-687f57d79b-2fm2x\" (UID: \"90feb50a-5cbc-4a77-b328-65b1f3adefc0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2fm2x" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.646483 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bsv6\" (UniqueName: 
\"kubernetes.io/projected/b05a33fd-fd6b-4b1b-ad0f-427586c8e81a-kube-api-access-4bsv6\") pod \"cert-manager-858654f9db-t952c\" (UID: \"b05a33fd-fd6b-4b1b-ad0f-427586c8e81a\") " pod="cert-manager/cert-manager-858654f9db-t952c" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.646548 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt4c4\" (UniqueName: \"kubernetes.io/projected/8d1ebd76-111d-461e-8031-13d1071d1e64-kube-api-access-zt4c4\") pod \"cert-manager-cainjector-cf98fcc89-5bw5t\" (UID: \"8d1ebd76-111d-461e-8031-13d1071d1e64\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.749123 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bsv6\" (UniqueName: \"kubernetes.io/projected/b05a33fd-fd6b-4b1b-ad0f-427586c8e81a-kube-api-access-4bsv6\") pod \"cert-manager-858654f9db-t952c\" (UID: \"b05a33fd-fd6b-4b1b-ad0f-427586c8e81a\") " pod="cert-manager/cert-manager-858654f9db-t952c" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.749208 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt4c4\" (UniqueName: \"kubernetes.io/projected/8d1ebd76-111d-461e-8031-13d1071d1e64-kube-api-access-zt4c4\") pod \"cert-manager-cainjector-cf98fcc89-5bw5t\" (UID: \"8d1ebd76-111d-461e-8031-13d1071d1e64\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.749233 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9fmx\" (UniqueName: \"kubernetes.io/projected/90feb50a-5cbc-4a77-b328-65b1f3adefc0-kube-api-access-v9fmx\") pod \"cert-manager-webhook-687f57d79b-2fm2x\" (UID: \"90feb50a-5cbc-4a77-b328-65b1f3adefc0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2fm2x" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.769678 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bsv6\" (UniqueName: \"kubernetes.io/projected/b05a33fd-fd6b-4b1b-ad0f-427586c8e81a-kube-api-access-4bsv6\") pod \"cert-manager-858654f9db-t952c\" (UID: \"b05a33fd-fd6b-4b1b-ad0f-427586c8e81a\") " pod="cert-manager/cert-manager-858654f9db-t952c" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.769885 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9fmx\" (UniqueName: \"kubernetes.io/projected/90feb50a-5cbc-4a77-b328-65b1f3adefc0-kube-api-access-v9fmx\") pod \"cert-manager-webhook-687f57d79b-2fm2x\" (UID: \"90feb50a-5cbc-4a77-b328-65b1f3adefc0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2fm2x" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.772650 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt4c4\" (UniqueName: \"kubernetes.io/projected/8d1ebd76-111d-461e-8031-13d1071d1e64-kube-api-access-zt4c4\") pod \"cert-manager-cainjector-cf98fcc89-5bw5t\" (UID: \"8d1ebd76-111d-461e-8031-13d1071d1e64\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.835708 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.848322 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-t952c" Mar 10 14:15:50 crc kubenswrapper[4911]: I0310 14:15:50.888110 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-2fm2x" Mar 10 14:15:51 crc kubenswrapper[4911]: I0310 14:15:51.062598 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-t952c"] Mar 10 14:15:51 crc kubenswrapper[4911]: W0310 14:15:51.072165 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb05a33fd_fd6b_4b1b_ad0f_427586c8e81a.slice/crio-0f662096e65d6b39637e78543ace1f3877050196f5780f75f0269835f51ab903 WatchSource:0}: Error finding container 0f662096e65d6b39637e78543ace1f3877050196f5780f75f0269835f51ab903: Status 404 returned error can't find the container with id 0f662096e65d6b39637e78543ace1f3877050196f5780f75f0269835f51ab903 Mar 10 14:15:51 crc kubenswrapper[4911]: W0310 14:15:51.095453 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d1ebd76_111d_461e_8031_13d1071d1e64.slice/crio-9a0978110bbd6fc280507ba717b16a1f2b4a058ede56e351ebef999434586442 WatchSource:0}: Error finding container 9a0978110bbd6fc280507ba717b16a1f2b4a058ede56e351ebef999434586442: Status 404 returned error can't find the container with id 9a0978110bbd6fc280507ba717b16a1f2b4a058ede56e351ebef999434586442 Mar 10 14:15:51 crc kubenswrapper[4911]: I0310 14:15:51.095655 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t"] Mar 10 14:15:51 crc kubenswrapper[4911]: I0310 14:15:51.247070 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-t952c" event={"ID":"b05a33fd-fd6b-4b1b-ad0f-427586c8e81a","Type":"ContainerStarted","Data":"0f662096e65d6b39637e78543ace1f3877050196f5780f75f0269835f51ab903"} Mar 10 14:15:51 crc kubenswrapper[4911]: I0310 14:15:51.248317 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t" event={"ID":"8d1ebd76-111d-461e-8031-13d1071d1e64","Type":"ContainerStarted","Data":"9a0978110bbd6fc280507ba717b16a1f2b4a058ede56e351ebef999434586442"} Mar 10 14:15:51 crc kubenswrapper[4911]: I0310 14:15:51.412923 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2fm2x"] Mar 10 14:15:51 crc kubenswrapper[4911]: W0310 14:15:51.418282 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90feb50a_5cbc_4a77_b328_65b1f3adefc0.slice/crio-e97f7095bcfd6bf996a11b85fc72cda3d133d5ec8038e69dfa7d0a346e78d846 WatchSource:0}: Error finding container e97f7095bcfd6bf996a11b85fc72cda3d133d5ec8038e69dfa7d0a346e78d846: Status 404 returned error can't find the container with id e97f7095bcfd6bf996a11b85fc72cda3d133d5ec8038e69dfa7d0a346e78d846 Mar 10 14:15:52 crc kubenswrapper[4911]: I0310 14:15:52.262647 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-2fm2x" event={"ID":"90feb50a-5cbc-4a77-b328-65b1f3adefc0","Type":"ContainerStarted","Data":"e97f7095bcfd6bf996a11b85fc72cda3d133d5ec8038e69dfa7d0a346e78d846"} Mar 10 14:15:54 crc kubenswrapper[4911]: I0310 14:15:54.276218 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-t952c" event={"ID":"b05a33fd-fd6b-4b1b-ad0f-427586c8e81a","Type":"ContainerStarted","Data":"630bffe94adf58f9fb644baf8bd40968c887df5509dc1e2b37eacdf517becbd0"} Mar 10 14:15:54 crc kubenswrapper[4911]: I0310 14:15:54.298110 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-t952c" podStartSLOduration=1.8874581689999999 podStartE2EDuration="4.298081997s" podCreationTimestamp="2026-03-10 14:15:50 +0000 UTC" firstStartedPulling="2026-03-10 14:15:51.078254624 +0000 UTC m=+855.641774541" lastFinishedPulling="2026-03-10 
14:15:53.488878452 +0000 UTC m=+858.052398369" observedRunningTime="2026-03-10 14:15:54.292429795 +0000 UTC m=+858.855949712" watchObservedRunningTime="2026-03-10 14:15:54.298081997 +0000 UTC m=+858.861601914" Mar 10 14:15:55 crc kubenswrapper[4911]: I0310 14:15:55.284714 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t" event={"ID":"8d1ebd76-111d-461e-8031-13d1071d1e64","Type":"ContainerStarted","Data":"e677119c87e2385b9faeb9b673ac0cf9fda0f15d59c283096171fdaec1812646"} Mar 10 14:15:55 crc kubenswrapper[4911]: I0310 14:15:55.287415 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-2fm2x" event={"ID":"90feb50a-5cbc-4a77-b328-65b1f3adefc0","Type":"ContainerStarted","Data":"fbfb4bee3bf39925194b75a5d5b6a3e321c54701cf7c4317d22ef4d7efcb88c0"} Mar 10 14:15:55 crc kubenswrapper[4911]: I0310 14:15:55.287567 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-2fm2x" Mar 10 14:15:55 crc kubenswrapper[4911]: I0310 14:15:55.301680 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5bw5t" podStartSLOduration=1.7799127540000002 podStartE2EDuration="5.301659904s" podCreationTimestamp="2026-03-10 14:15:50 +0000 UTC" firstStartedPulling="2026-03-10 14:15:51.098024004 +0000 UTC m=+855.661543921" lastFinishedPulling="2026-03-10 14:15:54.619771154 +0000 UTC m=+859.183291071" observedRunningTime="2026-03-10 14:15:55.299601508 +0000 UTC m=+859.863121425" watchObservedRunningTime="2026-03-10 14:15:55.301659904 +0000 UTC m=+859.865179821" Mar 10 14:15:55 crc kubenswrapper[4911]: I0310 14:15:55.317421 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-2fm2x" podStartSLOduration=2.068485344 podStartE2EDuration="5.317395126s" podCreationTimestamp="2026-03-10 14:15:50 
+0000 UTC" firstStartedPulling="2026-03-10 14:15:51.421546681 +0000 UTC m=+855.985066598" lastFinishedPulling="2026-03-10 14:15:54.670456443 +0000 UTC m=+859.233976380" observedRunningTime="2026-03-10 14:15:55.316756499 +0000 UTC m=+859.880276416" watchObservedRunningTime="2026-03-10 14:15:55.317395126 +0000 UTC m=+859.880915043" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.022063 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4256n"] Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.022887 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovn-controller" containerID="cri-o://a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e" gracePeriod=30 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.022947 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="sbdb" containerID="cri-o://5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39" gracePeriod=30 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.023017 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="kube-rbac-proxy-node" containerID="cri-o://87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f" gracePeriod=30 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.023091 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="nbdb" containerID="cri-o://0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89" gracePeriod=30 Mar 10 14:16:00 crc kubenswrapper[4911]: 
I0310 14:16:00.023179 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="northd" containerID="cri-o://55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1" gracePeriod=30 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.023252 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovn-acl-logging" containerID="cri-o://26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173" gracePeriod=30 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.023669 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e" gracePeriod=30 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.120614 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller" containerID="cri-o://d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803" gracePeriod=30 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.158012 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552536-jnbwk"] Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.158994 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.162436 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.162623 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.162634 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.284506 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97wvs\" (UniqueName: \"kubernetes.io/projected/af1f4b1e-ab3f-4d2b-b2d1-185188a4795c-kube-api-access-97wvs\") pod \"auto-csr-approver-29552536-jnbwk\" (UID: \"af1f4b1e-ab3f-4d2b-b2d1-185188a4795c\") " pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.322765 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nsxjn_fc662696-d402-4969-bebd-00fa42e63075/kube-multus/2.log" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.323416 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nsxjn_fc662696-d402-4969-bebd-00fa42e63075/kube-multus/1.log" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.323474 4911 generic.go:334] "Generic (PLEG): container finished" podID="fc662696-d402-4969-bebd-00fa42e63075" containerID="98f567cea8a7526e2d641b378c91a49bdbfd2a2fe2a7dae62170af48d16b4ae1" exitCode=2 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.323543 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nsxjn" 
event={"ID":"fc662696-d402-4969-bebd-00fa42e63075","Type":"ContainerDied","Data":"98f567cea8a7526e2d641b378c91a49bdbfd2a2fe2a7dae62170af48d16b4ae1"} Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.323591 4911 scope.go:117] "RemoveContainer" containerID="434ea388fa647987cf9df2476b4fcd253c62eb5fcb0193565eb08ab8e0cbcd09" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.324382 4911 scope.go:117] "RemoveContainer" containerID="98f567cea8a7526e2d641b378c91a49bdbfd2a2fe2a7dae62170af48d16b4ae1" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.329316 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovnkube-controller/3.log" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.352470 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovn-acl-logging/0.log" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.354244 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovn-controller/0.log" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.354778 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803" exitCode=0 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.354834 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1" exitCode=0 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.354850 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e" exitCode=0 Mar 10 14:16:00 crc 
kubenswrapper[4911]: I0310 14:16:00.354860 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f" exitCode=0 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.354869 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173" exitCode=143 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.354877 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e" exitCode=143 Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.354924 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803"} Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.356839 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1"} Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.356867 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e"} Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.356881 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" 
event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f"} Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.356918 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173"} Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.356933 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e"} Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.385943 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97wvs\" (UniqueName: \"kubernetes.io/projected/af1f4b1e-ab3f-4d2b-b2d1-185188a4795c-kube-api-access-97wvs\") pod \"auto-csr-approver-29552536-jnbwk\" (UID: \"af1f4b1e-ab3f-4d2b-b2d1-185188a4795c\") " pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.392063 4911 scope.go:117] "RemoveContainer" containerID="79e74437cf4b8019d63421cd931b44efb6a1bee427eb3082f453faed5f18aa6a" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.408027 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97wvs\" (UniqueName: \"kubernetes.io/projected/af1f4b1e-ab3f-4d2b-b2d1-185188a4795c-kube-api-access-97wvs\") pod \"auto-csr-approver-29552536-jnbwk\" (UID: \"af1f4b1e-ab3f-4d2b-b2d1-185188a4795c\") " pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.412995 4911 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovn-acl-logging/0.log"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.413394 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovn-controller/0.log"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.413801 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.485297 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w5dct"]
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.485828 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="kube-rbac-proxy-node"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.485852 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="kube-rbac-proxy-node"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.485882 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.485891 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.485899 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.485907 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.485919 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.485926 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.485939 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovn-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.485966 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovn-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.485975 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="kubecfg-setup"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.485983 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="kubecfg-setup"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.485991 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.485998 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.486006 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="kube-rbac-proxy-ovn-metrics"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486013 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="kube-rbac-proxy-ovn-metrics"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.486026 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486033 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.486046 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="sbdb"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486052 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="sbdb"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.486064 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovn-acl-logging"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486071 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovn-acl-logging"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.486081 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="nbdb"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486087 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="nbdb"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.486099 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="northd"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486107 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="northd"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486245 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="kube-rbac-proxy-node"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486256 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486266 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="kube-rbac-proxy-ovn-metrics"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486281 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486291 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovn-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486299 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="nbdb"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486308 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovn-acl-logging"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486318 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="sbdb"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486328 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="northd"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486336 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486653 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.486665 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" containerName="ovnkube-controller"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.489326 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.583291 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552536-jnbwk"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588626 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-netd\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588681 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed2b430b-2281-4231-9135-f0289be08cdd-ovn-node-metrics-cert\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588704 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-bin\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588741 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-etc-openvswitch\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588777 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-var-lib-openvswitch\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588795 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-kubelet\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588816 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-ovn-kubernetes\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588845 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-script-lib\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588872 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-openvswitch\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588903 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-ovn\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588932 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh7d8\" (UniqueName: \"kubernetes.io/projected/ed2b430b-2281-4231-9135-f0289be08cdd-kube-api-access-rh7d8\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588952 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-slash\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588976 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-systemd\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.588994 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-netns\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589028 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-node-log\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589063 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-env-overrides\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589082 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-systemd-units\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589103 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589123 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-log-socket\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589145 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-config\") pod \"ed2b430b-2281-4231-9135-f0289be08cdd\" (UID: \"ed2b430b-2281-4231-9135-f0289be08cdd\") "
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589486 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589808 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589865 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589900 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589910 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-slash" (OuterVolumeSpecName: "host-slash") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.589926 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.590028 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.590094 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.590545 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.590591 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.590605 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-node-log" (OuterVolumeSpecName: "node-log") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.590656 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.590666 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.590697 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.590753 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-log-socket" (OuterVolumeSpecName: "log-socket") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.590776 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.591609 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.592973 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2b430b-2281-4231-9135-f0289be08cdd-kube-api-access-rh7d8" (OuterVolumeSpecName: "kube-api-access-rh7d8") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "kube-api-access-rh7d8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.598800 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2b430b-2281-4231-9135-f0289be08cdd-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.614974 4911 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552536-jnbwk_openshift-infra_af1f4b1e-ab3f-4d2b-b2d1-185188a4795c_0(1192d5eb101dea50731fcf7e72d93fd9519d6ea80b8cdfed2f4b985d9c7b32f2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.615104 4911 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552536-jnbwk_openshift-infra_af1f4b1e-ab3f-4d2b-b2d1-185188a4795c_0(1192d5eb101dea50731fcf7e72d93fd9519d6ea80b8cdfed2f4b985d9c7b32f2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29552536-jnbwk"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.615150 4911 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552536-jnbwk_openshift-infra_af1f4b1e-ab3f-4d2b-b2d1-185188a4795c_0(1192d5eb101dea50731fcf7e72d93fd9519d6ea80b8cdfed2f4b985d9c7b32f2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29552536-jnbwk"
Mar 10 14:16:00 crc kubenswrapper[4911]: E0310 14:16:00.615235 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29552536-jnbwk_openshift-infra(af1f4b1e-ab3f-4d2b-b2d1-185188a4795c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29552536-jnbwk_openshift-infra(af1f4b1e-ab3f-4d2b-b2d1-185188a4795c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552536-jnbwk_openshift-infra_af1f4b1e-ab3f-4d2b-b2d1-185188a4795c_0(1192d5eb101dea50731fcf7e72d93fd9519d6ea80b8cdfed2f4b985d9c7b32f2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" podUID="af1f4b1e-ab3f-4d2b-b2d1-185188a4795c"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.617361 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ed2b430b-2281-4231-9135-f0289be08cdd" (UID: "ed2b430b-2281-4231-9135-f0289be08cdd"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690634 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-etc-openvswitch\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690717 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-cni-netd\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690771 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-run-ovn-kubernetes\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690799 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-kubelet\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690824 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-run-ovn\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690845 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4bf34236-ff13-45ec-8f92-c055f7151bab-ovnkube-config\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690866 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-run-openvswitch\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690894 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-run-netns\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690915 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4bf34236-ff13-45ec-8f92-c055f7151bab-env-overrides\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690948 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-systemd-units\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690968 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4bf34236-ff13-45ec-8f92-c055f7151bab-ovnkube-script-lib\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.690992 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtm24\" (UniqueName: \"kubernetes.io/projected/4bf34236-ff13-45ec-8f92-c055f7151bab-kube-api-access-wtm24\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.691196 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4bf34236-ff13-45ec-8f92-c055f7151bab-ovn-node-metrics-cert\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.691261 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-node-log\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.691393 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-log-socket\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.691433 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-run-systemd\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.691462 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-cni-bin\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.691499 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-var-lib-openvswitch\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692296 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692379 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-slash\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct"
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692556 4911 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692577 4911 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692589 4911 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-node-log\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692600 4911 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692610 4911 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692621 4911 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692630 4911 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-log-socket\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692639 4911 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692649 4911 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692661 4911 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed2b430b-2281-4231-9135-f0289be08cdd-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692669 4911 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692682 4911 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 
14:16:00.692692 4911 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692701 4911 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692710 4911 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692742 4911 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed2b430b-2281-4231-9135-f0289be08cdd-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692755 4911 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692763 4911 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692780 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh7d8\" (UniqueName: \"kubernetes.io/projected/ed2b430b-2281-4231-9135-f0289be08cdd-kube-api-access-rh7d8\") on node \"crc\" DevicePath \"\"" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.692789 4911 reconciler_common.go:293] "Volume detached for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/ed2b430b-2281-4231-9135-f0289be08cdd-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794196 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-slash\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794334 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-etc-openvswitch\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794394 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-cni-netd\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794425 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-kubelet\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794450 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-run-ovn-kubernetes\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794477 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-run-ovn\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794506 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4bf34236-ff13-45ec-8f92-c055f7151bab-ovnkube-config\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794535 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-run-openvswitch\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794567 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-run-netns\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794583 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-etc-openvswitch\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 
14:16:00.794589 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4bf34236-ff13-45ec-8f92-c055f7151bab-env-overrides\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.794884 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-run-netns\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795015 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-systemd-units\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795053 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-slash\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795066 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4bf34236-ff13-45ec-8f92-c055f7151bab-ovnkube-script-lib\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795103 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-cni-netd\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795112 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtm24\" (UniqueName: \"kubernetes.io/projected/4bf34236-ff13-45ec-8f92-c055f7151bab-kube-api-access-wtm24\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795172 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-run-ovn\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795221 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4bf34236-ff13-45ec-8f92-c055f7151bab-ovn-node-metrics-cert\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795261 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-node-log\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795028 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-kubelet\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795332 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-log-socket\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795371 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-run-systemd\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795423 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-cni-bin\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795470 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-var-lib-openvswitch\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795498 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-log-socket\") pod 
\"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795520 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795562 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795574 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-systemd-units\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795613 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-var-lib-openvswitch\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795660 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-run-ovn-kubernetes\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795681 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-host-cni-bin\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795708 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-run-systemd\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795798 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-node-log\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.795964 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4bf34236-ff13-45ec-8f92-c055f7151bab-ovnkube-script-lib\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.796363 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4bf34236-ff13-45ec-8f92-c055f7151bab-env-overrides\") pod \"ovnkube-node-w5dct\" (UID: 
\"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.796479 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4bf34236-ff13-45ec-8f92-c055f7151bab-run-openvswitch\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.798009 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4bf34236-ff13-45ec-8f92-c055f7151bab-ovnkube-config\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.800408 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4bf34236-ff13-45ec-8f92-c055f7151bab-ovn-node-metrics-cert\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.826562 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtm24\" (UniqueName: \"kubernetes.io/projected/4bf34236-ff13-45ec-8f92-c055f7151bab-kube-api-access-wtm24\") pod \"ovnkube-node-w5dct\" (UID: \"4bf34236-ff13-45ec-8f92-c055f7151bab\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:00 crc kubenswrapper[4911]: I0310 14:16:00.892665 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-2fm2x" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.107678 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.364162 4911 generic.go:334] "Generic (PLEG): container finished" podID="4bf34236-ff13-45ec-8f92-c055f7151bab" containerID="8d4aecf8f70ea2fbc58ebdeb088cff54d760c55c565dd5811bc5448bb13c172c" exitCode=0 Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.364325 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" event={"ID":"4bf34236-ff13-45ec-8f92-c055f7151bab","Type":"ContainerDied","Data":"8d4aecf8f70ea2fbc58ebdeb088cff54d760c55c565dd5811bc5448bb13c172c"} Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.365052 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" event={"ID":"4bf34236-ff13-45ec-8f92-c055f7151bab","Type":"ContainerStarted","Data":"251f969a1a8fbd241d2691258b71abac81f4a35156ec766ea4a8965af72d00f3"} Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.374134 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovn-acl-logging/0.log" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.374779 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ed2b430b-2281-4231-9135-f0289be08cdd/ovn-controller/0.log" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.375663 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39" exitCode=0 Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.375700 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed2b430b-2281-4231-9135-f0289be08cdd" containerID="0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89" exitCode=0 Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 
14:16:01.375788 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39"} Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.375824 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89"} Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.375841 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ed2b430b-2281-4231-9135-f0289be08cdd","Type":"ContainerDied","Data":"91bcd441701cce09d4b69201e187d85ba797da190cfb8f2d6e579f1921869e72"} Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.375845 4911 scope.go:117] "RemoveContainer" containerID="d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.375976 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.386339 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nsxjn_fc662696-d402-4969-bebd-00fa42e63075/kube-multus/2.log" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.386429 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nsxjn" event={"ID":"fc662696-d402-4969-bebd-00fa42e63075","Type":"ContainerStarted","Data":"ab0199d04b0df0da7eb01269f5c68212481419f611d784913ef0636d5215a491"} Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.418268 4911 scope.go:117] "RemoveContainer" containerID="5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.461031 4911 scope.go:117] "RemoveContainer" containerID="0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.480480 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4256n"] Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.484034 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4256n"] Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.496443 4911 scope.go:117] "RemoveContainer" containerID="55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.514304 4911 scope.go:117] "RemoveContainer" containerID="b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.532245 4911 scope.go:117] "RemoveContainer" containerID="87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.549447 4911 scope.go:117] "RemoveContainer" 
containerID="26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.582942 4911 scope.go:117] "RemoveContainer" containerID="a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.631906 4911 scope.go:117] "RemoveContainer" containerID="a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.650070 4911 scope.go:117] "RemoveContainer" containerID="d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803" Mar 10 14:16:01 crc kubenswrapper[4911]: E0310 14:16:01.650767 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803\": container with ID starting with d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803 not found: ID does not exist" containerID="d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.650810 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803"} err="failed to get container status \"d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803\": rpc error: code = NotFound desc = could not find container \"d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803\": container with ID starting with d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803 not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.650846 4911 scope.go:117] "RemoveContainer" containerID="5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39" Mar 10 14:16:01 crc kubenswrapper[4911]: E0310 14:16:01.651321 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\": container with ID starting with 5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39 not found: ID does not exist" containerID="5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.651353 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39"} err="failed to get container status \"5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\": rpc error: code = NotFound desc = could not find container \"5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\": container with ID starting with 5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39 not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.651374 4911 scope.go:117] "RemoveContainer" containerID="0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89" Mar 10 14:16:01 crc kubenswrapper[4911]: E0310 14:16:01.651694 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\": container with ID starting with 0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89 not found: ID does not exist" containerID="0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.651737 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89"} err="failed to get container status \"0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\": rpc error: code = NotFound desc = could not find container 
\"0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\": container with ID starting with 0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89 not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.651756 4911 scope.go:117] "RemoveContainer" containerID="55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1" Mar 10 14:16:01 crc kubenswrapper[4911]: E0310 14:16:01.652116 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\": container with ID starting with 55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1 not found: ID does not exist" containerID="55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.652157 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1"} err="failed to get container status \"55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\": rpc error: code = NotFound desc = could not find container \"55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\": container with ID starting with 55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1 not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.652178 4911 scope.go:117] "RemoveContainer" containerID="b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e" Mar 10 14:16:01 crc kubenswrapper[4911]: E0310 14:16:01.652546 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\": container with ID starting with b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e not found: ID does not exist" 
containerID="b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.652575 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e"} err="failed to get container status \"b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\": rpc error: code = NotFound desc = could not find container \"b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\": container with ID starting with b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.652590 4911 scope.go:117] "RemoveContainer" containerID="87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f" Mar 10 14:16:01 crc kubenswrapper[4911]: E0310 14:16:01.652869 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\": container with ID starting with 87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f not found: ID does not exist" containerID="87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.652901 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f"} err="failed to get container status \"87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\": rpc error: code = NotFound desc = could not find container \"87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\": container with ID starting with 87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.652922 4911 scope.go:117] 
"RemoveContainer" containerID="26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173" Mar 10 14:16:01 crc kubenswrapper[4911]: E0310 14:16:01.653212 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\": container with ID starting with 26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173 not found: ID does not exist" containerID="26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.653233 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173"} err="failed to get container status \"26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\": rpc error: code = NotFound desc = could not find container \"26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\": container with ID starting with 26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173 not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.653247 4911 scope.go:117] "RemoveContainer" containerID="a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e" Mar 10 14:16:01 crc kubenswrapper[4911]: E0310 14:16:01.653659 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\": container with ID starting with a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e not found: ID does not exist" containerID="a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.653682 4911 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e"} err="failed to get container status \"a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\": rpc error: code = NotFound desc = could not find container \"a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\": container with ID starting with a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.653698 4911 scope.go:117] "RemoveContainer" containerID="a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64" Mar 10 14:16:01 crc kubenswrapper[4911]: E0310 14:16:01.654115 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\": container with ID starting with a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64 not found: ID does not exist" containerID="a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.654172 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64"} err="failed to get container status \"a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\": rpc error: code = NotFound desc = could not find container \"a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\": container with ID starting with a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64 not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.654213 4911 scope.go:117] "RemoveContainer" containerID="d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.654564 4911 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803"} err="failed to get container status \"d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803\": rpc error: code = NotFound desc = could not find container \"d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803\": container with ID starting with d9be99304ed60528109f5afdd6b128b436c6dcb0766cdea4a5ed3e2576321803 not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.654594 4911 scope.go:117] "RemoveContainer" containerID="5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.654871 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39"} err="failed to get container status \"5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\": rpc error: code = NotFound desc = could not find container \"5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39\": container with ID starting with 5892c59ebde6ee4dd430da3d8dccbe35714258669042475c629f622b77440c39 not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.654893 4911 scope.go:117] "RemoveContainer" containerID="0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.655214 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89"} err="failed to get container status \"0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\": rpc error: code = NotFound desc = could not find container \"0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89\": container with ID starting with 0781f770a9f57ff684223e2b64df32647c2cb41be9e5d1a5ecaf78f5044b6e89 not 
found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.655245 4911 scope.go:117] "RemoveContainer" containerID="55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.655500 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1"} err="failed to get container status \"55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\": rpc error: code = NotFound desc = could not find container \"55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1\": container with ID starting with 55612d6a953e404371b31b1a3e3191210282b9be89019a6d8650f5d8c8ace1a1 not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.655528 4911 scope.go:117] "RemoveContainer" containerID="b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.655934 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e"} err="failed to get container status \"b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\": rpc error: code = NotFound desc = could not find container \"b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e\": container with ID starting with b2dc8226bfb32ce8e6599cf4ae5695b4ad23d3cb2794a9a656fb481414c55b0e not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.655994 4911 scope.go:117] "RemoveContainer" containerID="87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.656349 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f"} err="failed to get 
container status \"87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\": rpc error: code = NotFound desc = could not find container \"87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f\": container with ID starting with 87c8e34cd6da51d8b51fe2916fee126b5bdeeee05168934272e5d46faf74970f not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.656373 4911 scope.go:117] "RemoveContainer" containerID="26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.656688 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173"} err="failed to get container status \"26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\": rpc error: code = NotFound desc = could not find container \"26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173\": container with ID starting with 26ba5b593e1e3a30d3fe46ced8f5f31a67819ac6482431493723fe103093b173 not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.656715 4911 scope.go:117] "RemoveContainer" containerID="a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.657103 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e"} err="failed to get container status \"a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\": rpc error: code = NotFound desc = could not find container \"a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e\": container with ID starting with a822d6899808ba20a0a73f60980b07b18679845b54d1bf303b3c205030cabf9e not found: ID does not exist" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.657128 4911 scope.go:117] "RemoveContainer" 
containerID="a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64" Mar 10 14:16:01 crc kubenswrapper[4911]: I0310 14:16:01.657415 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64"} err="failed to get container status \"a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\": rpc error: code = NotFound desc = could not find container \"a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64\": container with ID starting with a75f4b3d212c7d6a1930f3aff879700793002d73a65686c54a0dbd7da45efc64 not found: ID does not exist" Mar 10 14:16:02 crc kubenswrapper[4911]: I0310 14:16:02.213117 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2b430b-2281-4231-9135-f0289be08cdd" path="/var/lib/kubelet/pods/ed2b430b-2281-4231-9135-f0289be08cdd/volumes" Mar 10 14:16:02 crc kubenswrapper[4911]: I0310 14:16:02.397841 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" event={"ID":"4bf34236-ff13-45ec-8f92-c055f7151bab","Type":"ContainerStarted","Data":"57fc7ce8b6e76bc2602a9035f3cf33443228367a0527a16eb012573f4c1bb318"} Mar 10 14:16:02 crc kubenswrapper[4911]: I0310 14:16:02.398227 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" event={"ID":"4bf34236-ff13-45ec-8f92-c055f7151bab","Type":"ContainerStarted","Data":"2ae275c81ae65536348b205bf0f0bdfad37f8ef1501f5c9530504990cd310f27"} Mar 10 14:16:02 crc kubenswrapper[4911]: I0310 14:16:02.398473 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" event={"ID":"4bf34236-ff13-45ec-8f92-c055f7151bab","Type":"ContainerStarted","Data":"45efef6991a5efceada85595cef717712bcfa480b7982ead157d2c57835288b7"} Mar 10 14:16:02 crc kubenswrapper[4911]: I0310 14:16:02.398581 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" event={"ID":"4bf34236-ff13-45ec-8f92-c055f7151bab","Type":"ContainerStarted","Data":"e6f38916287c62e8ea87bc177aba2943acfb4316ce514d7b957cbf9bd9c7f86c"} Mar 10 14:16:02 crc kubenswrapper[4911]: I0310 14:16:02.398691 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" event={"ID":"4bf34236-ff13-45ec-8f92-c055f7151bab","Type":"ContainerStarted","Data":"7e78ee26297ffa06ff3b58ab8465c2ffffe5981a15d21a529ae690136cc45f0e"} Mar 10 14:16:02 crc kubenswrapper[4911]: I0310 14:16:02.398816 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" event={"ID":"4bf34236-ff13-45ec-8f92-c055f7151bab","Type":"ContainerStarted","Data":"813bb855bf0e17917c872984c513c176dba894aae9716970ff2c2d7c26c9d698"} Mar 10 14:16:04 crc kubenswrapper[4911]: I0310 14:16:04.421225 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" event={"ID":"4bf34236-ff13-45ec-8f92-c055f7151bab","Type":"ContainerStarted","Data":"64d4dd3de818e77a7ba32236758155c871c07cd444702769bfdcec860876db8b"} Mar 10 14:16:07 crc kubenswrapper[4911]: I0310 14:16:07.448221 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" event={"ID":"4bf34236-ff13-45ec-8f92-c055f7151bab","Type":"ContainerStarted","Data":"dcae66bef4d69835becd319656769892f1cf23674f7561e52c5d415f52a3c839"} Mar 10 14:16:07 crc kubenswrapper[4911]: I0310 14:16:07.449379 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:07 crc kubenswrapper[4911]: I0310 14:16:07.449418 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:07 crc kubenswrapper[4911]: I0310 14:16:07.449445 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:07 crc kubenswrapper[4911]: I0310 14:16:07.495443 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" podStartSLOduration=7.495420354 podStartE2EDuration="7.495420354s" podCreationTimestamp="2026-03-10 14:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:16:07.49117148 +0000 UTC m=+872.054691457" watchObservedRunningTime="2026-03-10 14:16:07.495420354 +0000 UTC m=+872.058940261" Mar 10 14:16:07 crc kubenswrapper[4911]: I0310 14:16:07.509973 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:07 crc kubenswrapper[4911]: I0310 14:16:07.521147 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:08 crc kubenswrapper[4911]: I0310 14:16:08.073005 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552536-jnbwk"] Mar 10 14:16:08 crc kubenswrapper[4911]: I0310 14:16:08.073188 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:08 crc kubenswrapper[4911]: I0310 14:16:08.073678 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:08 crc kubenswrapper[4911]: E0310 14:16:08.111344 4911 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552536-jnbwk_openshift-infra_af1f4b1e-ab3f-4d2b-b2d1-185188a4795c_0(7a3042d38ee391aa8b467f6d7860025ad0728f07b7f7d77c5f76015796cd3cca): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 10 14:16:08 crc kubenswrapper[4911]: E0310 14:16:08.111436 4911 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552536-jnbwk_openshift-infra_af1f4b1e-ab3f-4d2b-b2d1-185188a4795c_0(7a3042d38ee391aa8b467f6d7860025ad0728f07b7f7d77c5f76015796cd3cca): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:08 crc kubenswrapper[4911]: E0310 14:16:08.111466 4911 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552536-jnbwk_openshift-infra_af1f4b1e-ab3f-4d2b-b2d1-185188a4795c_0(7a3042d38ee391aa8b467f6d7860025ad0728f07b7f7d77c5f76015796cd3cca): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:08 crc kubenswrapper[4911]: E0310 14:16:08.111523 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29552536-jnbwk_openshift-infra(af1f4b1e-ab3f-4d2b-b2d1-185188a4795c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29552536-jnbwk_openshift-infra(af1f4b1e-ab3f-4d2b-b2d1-185188a4795c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552536-jnbwk_openshift-infra_af1f4b1e-ab3f-4d2b-b2d1-185188a4795c_0(7a3042d38ee391aa8b467f6d7860025ad0728f07b7f7d77c5f76015796cd3cca): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" podUID="af1f4b1e-ab3f-4d2b-b2d1-185188a4795c" Mar 10 14:16:22 crc kubenswrapper[4911]: I0310 14:16:22.192433 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:22 crc kubenswrapper[4911]: I0310 14:16:22.194440 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:22 crc kubenswrapper[4911]: I0310 14:16:22.436529 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552536-jnbwk"] Mar 10 14:16:22 crc kubenswrapper[4911]: I0310 14:16:22.550106 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" event={"ID":"af1f4b1e-ab3f-4d2b-b2d1-185188a4795c","Type":"ContainerStarted","Data":"5716a4090bd766ec665c8862b9d4dbb18dd4db85751ea97732197283d02ca963"} Mar 10 14:16:24 crc kubenswrapper[4911]: I0310 14:16:24.573986 4911 generic.go:334] "Generic (PLEG): container finished" podID="af1f4b1e-ab3f-4d2b-b2d1-185188a4795c" containerID="11a15641096229e69f468a09c68f7a8b99e011a182675b73fe8fecf1cbfaf421" exitCode=0 Mar 10 14:16:24 crc kubenswrapper[4911]: I0310 14:16:24.574070 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" event={"ID":"af1f4b1e-ab3f-4d2b-b2d1-185188a4795c","Type":"ContainerDied","Data":"11a15641096229e69f468a09c68f7a8b99e011a182675b73fe8fecf1cbfaf421"} Mar 10 14:16:25 crc kubenswrapper[4911]: I0310 14:16:25.839456 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:25 crc kubenswrapper[4911]: I0310 14:16:25.933884 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97wvs\" (UniqueName: \"kubernetes.io/projected/af1f4b1e-ab3f-4d2b-b2d1-185188a4795c-kube-api-access-97wvs\") pod \"af1f4b1e-ab3f-4d2b-b2d1-185188a4795c\" (UID: \"af1f4b1e-ab3f-4d2b-b2d1-185188a4795c\") " Mar 10 14:16:25 crc kubenswrapper[4911]: I0310 14:16:25.940840 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1f4b1e-ab3f-4d2b-b2d1-185188a4795c-kube-api-access-97wvs" (OuterVolumeSpecName: "kube-api-access-97wvs") pod "af1f4b1e-ab3f-4d2b-b2d1-185188a4795c" (UID: "af1f4b1e-ab3f-4d2b-b2d1-185188a4795c"). InnerVolumeSpecName "kube-api-access-97wvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:16:26 crc kubenswrapper[4911]: I0310 14:16:26.036185 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97wvs\" (UniqueName: \"kubernetes.io/projected/af1f4b1e-ab3f-4d2b-b2d1-185188a4795c-kube-api-access-97wvs\") on node \"crc\" DevicePath \"\"" Mar 10 14:16:26 crc kubenswrapper[4911]: I0310 14:16:26.592251 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" event={"ID":"af1f4b1e-ab3f-4d2b-b2d1-185188a4795c","Type":"ContainerDied","Data":"5716a4090bd766ec665c8862b9d4dbb18dd4db85751ea97732197283d02ca963"} Mar 10 14:16:26 crc kubenswrapper[4911]: I0310 14:16:26.592319 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552536-jnbwk" Mar 10 14:16:26 crc kubenswrapper[4911]: I0310 14:16:26.592331 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5716a4090bd766ec665c8862b9d4dbb18dd4db85751ea97732197283d02ca963" Mar 10 14:16:26 crc kubenswrapper[4911]: I0310 14:16:26.904393 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552530-kshjj"] Mar 10 14:16:26 crc kubenswrapper[4911]: I0310 14:16:26.910273 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552530-kshjj"] Mar 10 14:16:28 crc kubenswrapper[4911]: I0310 14:16:28.203964 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71" path="/var/lib/kubelet/pods/d1a0b07d-ffe8-4ac6-861e-a8ddfcf83b71/volumes" Mar 10 14:16:31 crc kubenswrapper[4911]: I0310 14:16:31.130494 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5dct" Mar 10 14:16:36 crc kubenswrapper[4911]: I0310 14:16:36.850431 4911 scope.go:117] "RemoveContainer" containerID="33850a67294f29efe2211f3dd2ca090711bb1173a67d98d810cbf078e1695130" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.551458 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz"] Mar 10 14:16:41 crc kubenswrapper[4911]: E0310 14:16:41.552147 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1f4b1e-ab3f-4d2b-b2d1-185188a4795c" containerName="oc" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.552165 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1f4b1e-ab3f-4d2b-b2d1-185188a4795c" containerName="oc" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.552293 4911 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="af1f4b1e-ab3f-4d2b-b2d1-185188a4795c" containerName="oc" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.553315 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.557577 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.561156 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz"] Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.661508 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.661563 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdq7w\" (UniqueName: \"kubernetes.io/projected/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-kube-api-access-mdq7w\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.661631 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz\" (UID: 
\"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.763704 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.763771 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdq7w\" (UniqueName: \"kubernetes.io/projected/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-kube-api-access-mdq7w\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.763821 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.764307 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:41 crc 
kubenswrapper[4911]: I0310 14:16:41.764905 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.782864 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdq7w\" (UniqueName: \"kubernetes.io/projected/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-kube-api-access-mdq7w\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:41 crc kubenswrapper[4911]: I0310 14:16:41.869029 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" Mar 10 14:16:42 crc kubenswrapper[4911]: W0310 14:16:42.202098 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb017921_d0db_4e33_b0aa_c02d2cf72ce3.slice/crio-9b444ef10dfccb1fbec18ccf5399221228bb25ad65f79dc178ea1cbc74ab6c80 WatchSource:0}: Error finding container 9b444ef10dfccb1fbec18ccf5399221228bb25ad65f79dc178ea1cbc74ab6c80: Status 404 returned error can't find the container with id 9b444ef10dfccb1fbec18ccf5399221228bb25ad65f79dc178ea1cbc74ab6c80 Mar 10 14:16:42 crc kubenswrapper[4911]: I0310 14:16:42.211581 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz"] Mar 10 14:16:42 crc kubenswrapper[4911]: I0310 14:16:42.711050 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" event={"ID":"eb017921-d0db-4e33-b0aa-c02d2cf72ce3","Type":"ContainerStarted","Data":"d37c09d5716c396f678e25427ba330c623db3f651bb821e451a1f48d6faee987"} Mar 10 14:16:42 crc kubenswrapper[4911]: I0310 14:16:42.711111 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" event={"ID":"eb017921-d0db-4e33-b0aa-c02d2cf72ce3","Type":"ContainerStarted","Data":"9b444ef10dfccb1fbec18ccf5399221228bb25ad65f79dc178ea1cbc74ab6c80"} Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.583984 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-flmss"] Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.585375 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.599481 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-flmss"]
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.692270 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-catalog-content\") pod \"redhat-operators-flmss\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.692670 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-utilities\") pod \"redhat-operators-flmss\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.692698 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p2jp\" (UniqueName: \"kubernetes.io/projected/f7d817af-31f5-46da-a21c-0796724162c8-kube-api-access-4p2jp\") pod \"redhat-operators-flmss\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.724919 4911 generic.go:334] "Generic (PLEG): container finished" podID="eb017921-d0db-4e33-b0aa-c02d2cf72ce3" containerID="d37c09d5716c396f678e25427ba330c623db3f651bb821e451a1f48d6faee987" exitCode=0
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.724976 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" event={"ID":"eb017921-d0db-4e33-b0aa-c02d2cf72ce3","Type":"ContainerDied","Data":"d37c09d5716c396f678e25427ba330c623db3f651bb821e451a1f48d6faee987"}
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.794309 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-catalog-content\") pod \"redhat-operators-flmss\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.794371 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-utilities\") pod \"redhat-operators-flmss\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.794397 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p2jp\" (UniqueName: \"kubernetes.io/projected/f7d817af-31f5-46da-a21c-0796724162c8-kube-api-access-4p2jp\") pod \"redhat-operators-flmss\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.795637 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-catalog-content\") pod \"redhat-operators-flmss\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.795915 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-utilities\") pod \"redhat-operators-flmss\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.817795 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p2jp\" (UniqueName: \"kubernetes.io/projected/f7d817af-31f5-46da-a21c-0796724162c8-kube-api-access-4p2jp\") pod \"redhat-operators-flmss\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:43 crc kubenswrapper[4911]: I0310 14:16:43.907139 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:44 crc kubenswrapper[4911]: I0310 14:16:44.167348 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-flmss"]
Mar 10 14:16:44 crc kubenswrapper[4911]: W0310 14:16:44.175353 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d817af_31f5_46da_a21c_0796724162c8.slice/crio-e857437356a9c2425387f321609c1f17c1033cc937f939c4119c994a4a7c0b82 WatchSource:0}: Error finding container e857437356a9c2425387f321609c1f17c1033cc937f939c4119c994a4a7c0b82: Status 404 returned error can't find the container with id e857437356a9c2425387f321609c1f17c1033cc937f939c4119c994a4a7c0b82
Mar 10 14:16:44 crc kubenswrapper[4911]: I0310 14:16:44.733274 4911 generic.go:334] "Generic (PLEG): container finished" podID="f7d817af-31f5-46da-a21c-0796724162c8" containerID="1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2" exitCode=0
Mar 10 14:16:44 crc kubenswrapper[4911]: I0310 14:16:44.733341 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flmss" event={"ID":"f7d817af-31f5-46da-a21c-0796724162c8","Type":"ContainerDied","Data":"1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2"}
Mar 10 14:16:44 crc kubenswrapper[4911]: I0310 14:16:44.733386 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flmss" event={"ID":"f7d817af-31f5-46da-a21c-0796724162c8","Type":"ContainerStarted","Data":"e857437356a9c2425387f321609c1f17c1033cc937f939c4119c994a4a7c0b82"}
Mar 10 14:16:45 crc kubenswrapper[4911]: I0310 14:16:45.746144 4911 generic.go:334] "Generic (PLEG): container finished" podID="eb017921-d0db-4e33-b0aa-c02d2cf72ce3" containerID="b4daebcfe20529ac9809a6f1ef6657cebb06930c7a8d4046ec8ef2687a9135aa" exitCode=0
Mar 10 14:16:45 crc kubenswrapper[4911]: I0310 14:16:45.748482 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" event={"ID":"eb017921-d0db-4e33-b0aa-c02d2cf72ce3","Type":"ContainerDied","Data":"b4daebcfe20529ac9809a6f1ef6657cebb06930c7a8d4046ec8ef2687a9135aa"}
Mar 10 14:16:45 crc kubenswrapper[4911]: I0310 14:16:45.751572 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flmss" event={"ID":"f7d817af-31f5-46da-a21c-0796724162c8","Type":"ContainerStarted","Data":"ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9"}
Mar 10 14:16:46 crc kubenswrapper[4911]: I0310 14:16:46.762465 4911 generic.go:334] "Generic (PLEG): container finished" podID="eb017921-d0db-4e33-b0aa-c02d2cf72ce3" containerID="0f950d7daba8762c05d2b387ecad81a0dd4ad15fdf08d9e206b71d9eb6833f3b" exitCode=0
Mar 10 14:16:46 crc kubenswrapper[4911]: I0310 14:16:46.762557 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" event={"ID":"eb017921-d0db-4e33-b0aa-c02d2cf72ce3","Type":"ContainerDied","Data":"0f950d7daba8762c05d2b387ecad81a0dd4ad15fdf08d9e206b71d9eb6833f3b"}
Mar 10 14:16:47 crc kubenswrapper[4911]: I0310 14:16:47.779184 4911 generic.go:334] "Generic (PLEG): container finished" podID="f7d817af-31f5-46da-a21c-0796724162c8" containerID="ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9" exitCode=0
Mar 10 14:16:47 crc kubenswrapper[4911]: I0310 14:16:47.779272 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flmss" event={"ID":"f7d817af-31f5-46da-a21c-0796724162c8","Type":"ContainerDied","Data":"ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9"}
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.112112 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz"
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.274282 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdq7w\" (UniqueName: \"kubernetes.io/projected/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-kube-api-access-mdq7w\") pod \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") "
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.274456 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-util\") pod \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") "
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.274522 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-bundle\") pod \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\" (UID: \"eb017921-d0db-4e33-b0aa-c02d2cf72ce3\") "
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.275633 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-bundle" (OuterVolumeSpecName: "bundle") pod "eb017921-d0db-4e33-b0aa-c02d2cf72ce3" (UID: "eb017921-d0db-4e33-b0aa-c02d2cf72ce3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.285905 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-kube-api-access-mdq7w" (OuterVolumeSpecName: "kube-api-access-mdq7w") pod "eb017921-d0db-4e33-b0aa-c02d2cf72ce3" (UID: "eb017921-d0db-4e33-b0aa-c02d2cf72ce3"). InnerVolumeSpecName "kube-api-access-mdq7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.297481 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-util" (OuterVolumeSpecName: "util") pod "eb017921-d0db-4e33-b0aa-c02d2cf72ce3" (UID: "eb017921-d0db-4e33-b0aa-c02d2cf72ce3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.376710 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdq7w\" (UniqueName: \"kubernetes.io/projected/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-kube-api-access-mdq7w\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.376775 4911 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-util\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.376791 4911 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb017921-d0db-4e33-b0aa-c02d2cf72ce3-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.789894 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz"
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.791425 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz" event={"ID":"eb017921-d0db-4e33-b0aa-c02d2cf72ce3","Type":"ContainerDied","Data":"9b444ef10dfccb1fbec18ccf5399221228bb25ad65f79dc178ea1cbc74ab6c80"}
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.791613 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b444ef10dfccb1fbec18ccf5399221228bb25ad65f79dc178ea1cbc74ab6c80"
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.793207 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flmss" event={"ID":"f7d817af-31f5-46da-a21c-0796724162c8","Type":"ContainerStarted","Data":"b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf"}
Mar 10 14:16:48 crc kubenswrapper[4911]: I0310 14:16:48.822113 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-flmss" podStartSLOduration=2.067621325 podStartE2EDuration="5.822086316s" podCreationTimestamp="2026-03-10 14:16:43 +0000 UTC" firstStartedPulling="2026-03-10 14:16:44.735885226 +0000 UTC m=+909.299405143" lastFinishedPulling="2026-03-10 14:16:48.490350217 +0000 UTC m=+913.053870134" observedRunningTime="2026-03-10 14:16:48.818983222 +0000 UTC m=+913.382503139" watchObservedRunningTime="2026-03-10 14:16:48.822086316 +0000 UTC m=+913.385606263"
Mar 10 14:16:51 crc kubenswrapper[4911]: I0310 14:16:51.883659 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd"]
Mar 10 14:16:51 crc kubenswrapper[4911]: E0310 14:16:51.884498 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb017921-d0db-4e33-b0aa-c02d2cf72ce3" containerName="util"
Mar 10 14:16:51 crc kubenswrapper[4911]: I0310 14:16:51.884521 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb017921-d0db-4e33-b0aa-c02d2cf72ce3" containerName="util"
Mar 10 14:16:51 crc kubenswrapper[4911]: E0310 14:16:51.884535 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb017921-d0db-4e33-b0aa-c02d2cf72ce3" containerName="extract"
Mar 10 14:16:51 crc kubenswrapper[4911]: I0310 14:16:51.884544 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb017921-d0db-4e33-b0aa-c02d2cf72ce3" containerName="extract"
Mar 10 14:16:51 crc kubenswrapper[4911]: E0310 14:16:51.884558 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb017921-d0db-4e33-b0aa-c02d2cf72ce3" containerName="pull"
Mar 10 14:16:51 crc kubenswrapper[4911]: I0310 14:16:51.884565 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb017921-d0db-4e33-b0aa-c02d2cf72ce3" containerName="pull"
Mar 10 14:16:51 crc kubenswrapper[4911]: I0310 14:16:51.884698 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb017921-d0db-4e33-b0aa-c02d2cf72ce3" containerName="extract"
Mar 10 14:16:51 crc kubenswrapper[4911]: I0310 14:16:51.885323 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd"
Mar 10 14:16:51 crc kubenswrapper[4911]: I0310 14:16:51.887778 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-h6nvz"
Mar 10 14:16:51 crc kubenswrapper[4911]: I0310 14:16:51.887936 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 10 14:16:51 crc kubenswrapper[4911]: I0310 14:16:51.888013 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 10 14:16:51 crc kubenswrapper[4911]: I0310 14:16:51.896661 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd"]
Mar 10 14:16:52 crc kubenswrapper[4911]: I0310 14:16:52.034875 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh8k2\" (UniqueName: \"kubernetes.io/projected/fdc52c27-9268-47bd-b07e-8d9995db81bb-kube-api-access-xh8k2\") pod \"nmstate-operator-75c5dccd6c-rdddd\" (UID: \"fdc52c27-9268-47bd-b07e-8d9995db81bb\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd"
Mar 10 14:16:52 crc kubenswrapper[4911]: I0310 14:16:52.135869 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh8k2\" (UniqueName: \"kubernetes.io/projected/fdc52c27-9268-47bd-b07e-8d9995db81bb-kube-api-access-xh8k2\") pod \"nmstate-operator-75c5dccd6c-rdddd\" (UID: \"fdc52c27-9268-47bd-b07e-8d9995db81bb\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd"
Mar 10 14:16:52 crc kubenswrapper[4911]: I0310 14:16:52.153800 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh8k2\" (UniqueName: \"kubernetes.io/projected/fdc52c27-9268-47bd-b07e-8d9995db81bb-kube-api-access-xh8k2\") pod \"nmstate-operator-75c5dccd6c-rdddd\" (UID: \"fdc52c27-9268-47bd-b07e-8d9995db81bb\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd"
Mar 10 14:16:52 crc kubenswrapper[4911]: I0310 14:16:52.204226 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd"
Mar 10 14:16:52 crc kubenswrapper[4911]: I0310 14:16:52.451840 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd"]
Mar 10 14:16:52 crc kubenswrapper[4911]: I0310 14:16:52.821301 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd" event={"ID":"fdc52c27-9268-47bd-b07e-8d9995db81bb","Type":"ContainerStarted","Data":"c6817474597cfd0b3f4b1901d90de65c43557efa6ab7442bad01e31cb4a7b585"}
Mar 10 14:16:53 crc kubenswrapper[4911]: I0310 14:16:53.908193 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:53 crc kubenswrapper[4911]: I0310 14:16:53.908500 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-flmss"
Mar 10 14:16:54 crc kubenswrapper[4911]: I0310 14:16:54.962384 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-flmss" podUID="f7d817af-31f5-46da-a21c-0796724162c8" containerName="registry-server" probeResult="failure" output=<
Mar 10 14:16:54 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s
Mar 10 14:16:54 crc kubenswrapper[4911]: >
Mar 10 14:16:55 crc kubenswrapper[4911]: I0310 14:16:55.844346 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd" event={"ID":"fdc52c27-9268-47bd-b07e-8d9995db81bb","Type":"ContainerStarted","Data":"c01fb9b02a233b94c28fc29816d8f15003fc4eb2ed8e8de0f8f3abd53a00f27e"}
Mar 10 14:16:55 crc kubenswrapper[4911]: I0310 14:16:55.868357 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rdddd" podStartSLOduration=2.034247293 podStartE2EDuration="4.868325265s" podCreationTimestamp="2026-03-10 14:16:51 +0000 UTC" firstStartedPulling="2026-03-10 14:16:52.460792823 +0000 UTC m=+917.024312740" lastFinishedPulling="2026-03-10 14:16:55.294870775 +0000 UTC m=+919.858390712" observedRunningTime="2026-03-10 14:16:55.866148656 +0000 UTC m=+920.429668573" watchObservedRunningTime="2026-03-10 14:16:55.868325265 +0000 UTC m=+920.431845182"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.711244 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-hj6l2"]
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.712895 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-hj6l2"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.717066 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-4ffxg"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.720289 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"]
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.721336 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.724678 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.736547 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"]
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.751244 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-v89h8"]
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.753918 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.782454 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-hj6l2"]
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.859652 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"]
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.860437 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.862980 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.863202 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.865864 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-wkfl6"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.874238 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/85141c5d-b88f-4970-a62f-e826726facc1-ovs-socket\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.874297 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8vwl\" (UniqueName: \"kubernetes.io/projected/9d21e056-f31e-4a31-a5fe-1543fd7dbc98-kube-api-access-d8vwl\") pod \"nmstate-webhook-786f45cff4-rr2pg\" (UID: \"9d21e056-f31e-4a31-a5fe-1543fd7dbc98\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.874337 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8znml\" (UniqueName: \"kubernetes.io/projected/85141c5d-b88f-4970-a62f-e826726facc1-kube-api-access-8znml\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.874358 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9d21e056-f31e-4a31-a5fe-1543fd7dbc98-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-rr2pg\" (UID: \"9d21e056-f31e-4a31-a5fe-1543fd7dbc98\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.874404 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hctxn\" (UniqueName: \"kubernetes.io/projected/892d43d8-c12a-47b3-8056-d0d6024e961e-kube-api-access-hctxn\") pod \"nmstate-metrics-69594cc75-hj6l2\" (UID: \"892d43d8-c12a-47b3-8056-d0d6024e961e\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-hj6l2"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.874433 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/85141c5d-b88f-4970-a62f-e826726facc1-dbus-socket\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.874484 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/85141c5d-b88f-4970-a62f-e826726facc1-nmstate-lock\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:01 crc kubenswrapper[4911]: I0310 14:17:01.909370 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"]
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002059 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9cf4f47-5446-49ed-95d1-b3e9322ce43e-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-dw2wk\" (UID: \"c9cf4f47-5446-49ed-95d1-b3e9322ce43e\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002111 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dq2v\" (UniqueName: \"kubernetes.io/projected/c9cf4f47-5446-49ed-95d1-b3e9322ce43e-kube-api-access-7dq2v\") pod \"nmstate-console-plugin-5dcbbd79cf-dw2wk\" (UID: \"c9cf4f47-5446-49ed-95d1-b3e9322ce43e\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002157 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/85141c5d-b88f-4970-a62f-e826726facc1-ovs-socket\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002179 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8vwl\" (UniqueName: \"kubernetes.io/projected/9d21e056-f31e-4a31-a5fe-1543fd7dbc98-kube-api-access-d8vwl\") pod \"nmstate-webhook-786f45cff4-rr2pg\" (UID: \"9d21e056-f31e-4a31-a5fe-1543fd7dbc98\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002202 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8znml\" (UniqueName: \"kubernetes.io/projected/85141c5d-b88f-4970-a62f-e826726facc1-kube-api-access-8znml\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002220 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9d21e056-f31e-4a31-a5fe-1543fd7dbc98-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-rr2pg\" (UID: \"9d21e056-f31e-4a31-a5fe-1543fd7dbc98\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002251 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hctxn\" (UniqueName: \"kubernetes.io/projected/892d43d8-c12a-47b3-8056-d0d6024e961e-kube-api-access-hctxn\") pod \"nmstate-metrics-69594cc75-hj6l2\" (UID: \"892d43d8-c12a-47b3-8056-d0d6024e961e\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-hj6l2"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002276 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/85141c5d-b88f-4970-a62f-e826726facc1-dbus-socket\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002316 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/85141c5d-b88f-4970-a62f-e826726facc1-nmstate-lock\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002339 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c9cf4f47-5446-49ed-95d1-b3e9322ce43e-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-dw2wk\" (UID: \"c9cf4f47-5446-49ed-95d1-b3e9322ce43e\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.002447 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/85141c5d-b88f-4970-a62f-e826726facc1-ovs-socket\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.003830 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/85141c5d-b88f-4970-a62f-e826726facc1-dbus-socket\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.004024 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/85141c5d-b88f-4970-a62f-e826726facc1-nmstate-lock\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.023425 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9d21e056-f31e-4a31-a5fe-1543fd7dbc98-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-rr2pg\" (UID: \"9d21e056-f31e-4a31-a5fe-1543fd7dbc98\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.027362 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hctxn\" (UniqueName: \"kubernetes.io/projected/892d43d8-c12a-47b3-8056-d0d6024e961e-kube-api-access-hctxn\") pod \"nmstate-metrics-69594cc75-hj6l2\" (UID: \"892d43d8-c12a-47b3-8056-d0d6024e961e\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-hj6l2"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.028173 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8vwl\" (UniqueName: \"kubernetes.io/projected/9d21e056-f31e-4a31-a5fe-1543fd7dbc98-kube-api-access-d8vwl\") pod \"nmstate-webhook-786f45cff4-rr2pg\" (UID: \"9d21e056-f31e-4a31-a5fe-1543fd7dbc98\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.029671 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8znml\" (UniqueName: \"kubernetes.io/projected/85141c5d-b88f-4970-a62f-e826726facc1-kube-api-access-8znml\") pod \"nmstate-handler-v89h8\" (UID: \"85141c5d-b88f-4970-a62f-e826726facc1\") " pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.038422 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-hj6l2"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.048085 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.073045 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-v89h8"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.103224 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c9cf4f47-5446-49ed-95d1-b3e9322ce43e-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-dw2wk\" (UID: \"c9cf4f47-5446-49ed-95d1-b3e9322ce43e\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.103283 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dq2v\" (UniqueName: \"kubernetes.io/projected/c9cf4f47-5446-49ed-95d1-b3e9322ce43e-kube-api-access-7dq2v\") pod \"nmstate-console-plugin-5dcbbd79cf-dw2wk\" (UID: \"c9cf4f47-5446-49ed-95d1-b3e9322ce43e\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.103325 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9cf4f47-5446-49ed-95d1-b3e9322ce43e-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-dw2wk\" (UID: \"c9cf4f47-5446-49ed-95d1-b3e9322ce43e\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.134610 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c9cf4f47-5446-49ed-95d1-b3e9322ce43e-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-dw2wk\" (UID: \"c9cf4f47-5446-49ed-95d1-b3e9322ce43e\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.151770 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dq2v\" (UniqueName: \"kubernetes.io/projected/c9cf4f47-5446-49ed-95d1-b3e9322ce43e-kube-api-access-7dq2v\") pod \"nmstate-console-plugin-5dcbbd79cf-dw2wk\" (UID: \"c9cf4f47-5446-49ed-95d1-b3e9322ce43e\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.153681 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.163825 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f47c96fb4-9hh74"]
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.167567 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.179346 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f47c96fb4-9hh74"]
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.200889 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9cf4f47-5446-49ed-95d1-b3e9322ce43e-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-dw2wk\" (UID: \"c9cf4f47-5446-49ed-95d1-b3e9322ce43e\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.315374 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-console-config\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.326310 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-trusted-ca-bundle\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.326485 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b8fc041-4023-447a-83b2-c221de50c8f3-console-serving-cert\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.326583 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b8fc041-4023-447a-83b2-c221de50c8f3-console-oauth-config\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.326638 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-666kh\" (UniqueName: \"kubernetes.io/projected/4b8fc041-4023-447a-83b2-c221de50c8f3-kube-api-access-666kh\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.326763 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-service-ca\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.326835 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-oauth-serving-cert\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.427305 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b8fc041-4023-447a-83b2-c221de50c8f3-console-serving-cert\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.427363 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b8fc041-4023-447a-83b2-c221de50c8f3-console-oauth-config\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.427394 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-666kh\" (UniqueName: \"kubernetes.io/projected/4b8fc041-4023-447a-83b2-c221de50c8f3-kube-api-access-666kh\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.427425 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-service-ca\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74"
Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.427451 4911 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-oauth-serving-cert\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.427491 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-console-config\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.427538 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-trusted-ca-bundle\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.428678 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-trusted-ca-bundle\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.429642 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-service-ca\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.430266 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-console-config\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.432615 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b8fc041-4023-447a-83b2-c221de50c8f3-oauth-serving-cert\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.435659 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b8fc041-4023-447a-83b2-c221de50c8f3-console-serving-cert\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.440870 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-hj6l2"] Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.441071 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b8fc041-4023-447a-83b2-c221de50c8f3-console-oauth-config\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: W0310 14:17:02.453256 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod892d43d8_c12a_47b3_8056_d0d6024e961e.slice/crio-2b104c6eb31f1d9c685e9f8dce0f1a30b53df2eaf02fa1c2613ad8bbd79a59f4 WatchSource:0}: Error finding container 2b104c6eb31f1d9c685e9f8dce0f1a30b53df2eaf02fa1c2613ad8bbd79a59f4: Status 404 returned 
error can't find the container with id 2b104c6eb31f1d9c685e9f8dce0f1a30b53df2eaf02fa1c2613ad8bbd79a59f4 Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.454827 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-666kh\" (UniqueName: \"kubernetes.io/projected/4b8fc041-4023-447a-83b2-c221de50c8f3-kube-api-access-666kh\") pod \"console-f47c96fb4-9hh74\" (UID: \"4b8fc041-4023-447a-83b2-c221de50c8f3\") " pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.478303 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk" Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.482381 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg"] Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.486755 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:02 crc kubenswrapper[4911]: W0310 14:17:02.492022 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d21e056_f31e_4a31_a5fe_1543fd7dbc98.slice/crio-7462be0b1df14e0e0768c86082dcff8bebd751acf5d0fb4dbd6ec20ddb2db921 WatchSource:0}: Error finding container 7462be0b1df14e0e0768c86082dcff8bebd751acf5d0fb4dbd6ec20ddb2db921: Status 404 returned error can't find the container with id 7462be0b1df14e0e0768c86082dcff8bebd751acf5d0fb4dbd6ec20ddb2db921 Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.697429 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk"] Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.749898 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f47c96fb4-9hh74"] Mar 10 14:17:02 crc kubenswrapper[4911]: W0310 
14:17:02.759980 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b8fc041_4023_447a_83b2_c221de50c8f3.slice/crio-ae818497700cebbeda83a561ff9ed99c8ef6c168ef3c5c2ee880769d040af567 WatchSource:0}: Error finding container ae818497700cebbeda83a561ff9ed99c8ef6c168ef3c5c2ee880769d040af567: Status 404 returned error can't find the container with id ae818497700cebbeda83a561ff9ed99c8ef6c168ef3c5c2ee880769d040af567 Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.907177 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg" event={"ID":"9d21e056-f31e-4a31-a5fe-1543fd7dbc98","Type":"ContainerStarted","Data":"7462be0b1df14e0e0768c86082dcff8bebd751acf5d0fb4dbd6ec20ddb2db921"} Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.908199 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-hj6l2" event={"ID":"892d43d8-c12a-47b3-8056-d0d6024e961e","Type":"ContainerStarted","Data":"2b104c6eb31f1d9c685e9f8dce0f1a30b53df2eaf02fa1c2613ad8bbd79a59f4"} Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.910080 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f47c96fb4-9hh74" event={"ID":"4b8fc041-4023-447a-83b2-c221de50c8f3","Type":"ContainerStarted","Data":"aac458d4d81319fbb9d5e0a5e7baad695093ea4220b253e89ef861e0f2f708b9"} Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.910112 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f47c96fb4-9hh74" event={"ID":"4b8fc041-4023-447a-83b2-c221de50c8f3","Type":"ContainerStarted","Data":"ae818497700cebbeda83a561ff9ed99c8ef6c168ef3c5c2ee880769d040af567"} Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.911103 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-v89h8" 
event={"ID":"85141c5d-b88f-4970-a62f-e826726facc1","Type":"ContainerStarted","Data":"461816cbe4a1e2654737824e6018264171e7e34225c291fa16d067b5bcecc7e0"} Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.912501 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk" event={"ID":"c9cf4f47-5446-49ed-95d1-b3e9322ce43e","Type":"ContainerStarted","Data":"fa5c83e0008360688e51ad9214351c5e2977305c54b215820c5efef5d7f3b55d"} Mar 10 14:17:02 crc kubenswrapper[4911]: I0310 14:17:02.937696 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f47c96fb4-9hh74" podStartSLOduration=0.937659297 podStartE2EDuration="937.659297ms" podCreationTimestamp="2026-03-10 14:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:17:02.926373403 +0000 UTC m=+927.489893330" watchObservedRunningTime="2026-03-10 14:17:02.937659297 +0000 UTC m=+927.501179214" Mar 10 14:17:03 crc kubenswrapper[4911]: I0310 14:17:03.957082 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-flmss" Mar 10 14:17:04 crc kubenswrapper[4911]: I0310 14:17:04.006869 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-flmss" Mar 10 14:17:04 crc kubenswrapper[4911]: I0310 14:17:04.188617 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-flmss"] Mar 10 14:17:05 crc kubenswrapper[4911]: I0310 14:17:05.932177 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-flmss" podUID="f7d817af-31f5-46da-a21c-0796724162c8" containerName="registry-server" containerID="cri-o://b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf" gracePeriod=2 Mar 10 14:17:06 crc 
kubenswrapper[4911]: I0310 14:17:06.464187 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-flmss" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.602916 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-catalog-content\") pod \"f7d817af-31f5-46da-a21c-0796724162c8\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.603002 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-utilities\") pod \"f7d817af-31f5-46da-a21c-0796724162c8\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.603030 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p2jp\" (UniqueName: \"kubernetes.io/projected/f7d817af-31f5-46da-a21c-0796724162c8-kube-api-access-4p2jp\") pod \"f7d817af-31f5-46da-a21c-0796724162c8\" (UID: \"f7d817af-31f5-46da-a21c-0796724162c8\") " Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.604598 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-utilities" (OuterVolumeSpecName: "utilities") pod "f7d817af-31f5-46da-a21c-0796724162c8" (UID: "f7d817af-31f5-46da-a21c-0796724162c8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.611276 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d817af-31f5-46da-a21c-0796724162c8-kube-api-access-4p2jp" (OuterVolumeSpecName: "kube-api-access-4p2jp") pod "f7d817af-31f5-46da-a21c-0796724162c8" (UID: "f7d817af-31f5-46da-a21c-0796724162c8"). InnerVolumeSpecName "kube-api-access-4p2jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.704361 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.704645 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p2jp\" (UniqueName: \"kubernetes.io/projected/f7d817af-31f5-46da-a21c-0796724162c8-kube-api-access-4p2jp\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.731278 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7d817af-31f5-46da-a21c-0796724162c8" (UID: "f7d817af-31f5-46da-a21c-0796724162c8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.805665 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d817af-31f5-46da-a21c-0796724162c8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.939165 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-v89h8" event={"ID":"85141c5d-b88f-4970-a62f-e826726facc1","Type":"ContainerStarted","Data":"6c51fb9fadc0699d9bb70771c151009ad1d5f7857c0b5fb2f6a7b5dfc1aa7816"} Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.939837 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-v89h8" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.940367 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk" event={"ID":"c9cf4f47-5446-49ed-95d1-b3e9322ce43e","Type":"ContainerStarted","Data":"6696e2c24c29fb6656faa3ed4410b62880c94cad385f71a010b8f3bb0b9f0be7"} Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.941546 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg" event={"ID":"9d21e056-f31e-4a31-a5fe-1543fd7dbc98","Type":"ContainerStarted","Data":"ebe340b252cf38fa2ea9b9df4e715b8956879d40bb8cc8652b112568855400a0"} Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.942277 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.943498 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-hj6l2" event={"ID":"892d43d8-c12a-47b3-8056-d0d6024e961e","Type":"ContainerStarted","Data":"62349de45bb9f7f945d7f70ab3a17365e567d89d08432ee74a80d6bab8d301d6"} Mar 
10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.945178 4911 generic.go:334] "Generic (PLEG): container finished" podID="f7d817af-31f5-46da-a21c-0796724162c8" containerID="b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf" exitCode=0 Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.945228 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flmss" event={"ID":"f7d817af-31f5-46da-a21c-0796724162c8","Type":"ContainerDied","Data":"b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf"} Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.945261 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flmss" event={"ID":"f7d817af-31f5-46da-a21c-0796724162c8","Type":"ContainerDied","Data":"e857437356a9c2425387f321609c1f17c1033cc937f939c4119c994a4a7c0b82"} Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.945294 4911 scope.go:117] "RemoveContainer" containerID="b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.945457 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-flmss" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.982672 4911 scope.go:117] "RemoveContainer" containerID="ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9" Mar 10 14:17:06 crc kubenswrapper[4911]: I0310 14:17:06.989821 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-v89h8" podStartSLOduration=1.806051487 podStartE2EDuration="5.989791878s" podCreationTimestamp="2026-03-10 14:17:01 +0000 UTC" firstStartedPulling="2026-03-10 14:17:02.153379211 +0000 UTC m=+926.716899128" lastFinishedPulling="2026-03-10 14:17:06.337119612 +0000 UTC m=+930.900639519" observedRunningTime="2026-03-10 14:17:06.987550497 +0000 UTC m=+931.551070414" watchObservedRunningTime="2026-03-10 14:17:06.989791878 +0000 UTC m=+931.553311795" Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.007022 4911 scope.go:117] "RemoveContainer" containerID="1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2" Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.014609 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-dw2wk" podStartSLOduration=2.391804427 podStartE2EDuration="6.014577756s" podCreationTimestamp="2026-03-10 14:17:01 +0000 UTC" firstStartedPulling="2026-03-10 14:17:02.70423908 +0000 UTC m=+927.267758997" lastFinishedPulling="2026-03-10 14:17:06.327012399 +0000 UTC m=+930.890532326" observedRunningTime="2026-03-10 14:17:07.009095168 +0000 UTC m=+931.572615125" watchObservedRunningTime="2026-03-10 14:17:07.014577756 +0000 UTC m=+931.578097673" Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.032954 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-flmss"] Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.040818 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-flmss"] Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.049269 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg" podStartSLOduration=2.2185258230000002 podStartE2EDuration="6.049243951s" podCreationTimestamp="2026-03-10 14:17:01 +0000 UTC" firstStartedPulling="2026-03-10 14:17:02.505613402 +0000 UTC m=+927.069133319" lastFinishedPulling="2026-03-10 14:17:06.33633153 +0000 UTC m=+930.899851447" observedRunningTime="2026-03-10 14:17:07.045987274 +0000 UTC m=+931.609507201" watchObservedRunningTime="2026-03-10 14:17:07.049243951 +0000 UTC m=+931.612763878" Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.053462 4911 scope.go:117] "RemoveContainer" containerID="b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf" Mar 10 14:17:07 crc kubenswrapper[4911]: E0310 14:17:07.054089 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf\": container with ID starting with b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf not found: ID does not exist" containerID="b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf" Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.054150 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf"} err="failed to get container status \"b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf\": rpc error: code = NotFound desc = could not find container \"b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf\": container with ID starting with b9f5c16bb63a51cdca698689f6c0115a013d4513be07f4b8ea93ea6af69ebadf not found: ID does not exist" Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.054187 4911 
scope.go:117] "RemoveContainer" containerID="ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9" Mar 10 14:17:07 crc kubenswrapper[4911]: E0310 14:17:07.054588 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9\": container with ID starting with ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9 not found: ID does not exist" containerID="ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9" Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.054624 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9"} err="failed to get container status \"ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9\": rpc error: code = NotFound desc = could not find container \"ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9\": container with ID starting with ebeecdee683b9b40f0280682c0f29182ebae8332d5e3c5b2ad73f5e2448f4ca9 not found: ID does not exist" Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.054643 4911 scope.go:117] "RemoveContainer" containerID="1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2" Mar 10 14:17:07 crc kubenswrapper[4911]: E0310 14:17:07.055132 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2\": container with ID starting with 1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2 not found: ID does not exist" containerID="1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2" Mar 10 14:17:07 crc kubenswrapper[4911]: I0310 14:17:07.055174 4911 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2"} err="failed to get container status \"1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2\": rpc error: code = NotFound desc = could not find container \"1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2\": container with ID starting with 1d66989ee941491e2b5dbe2e2e2383b3c521503e1cc1357736d19becb649bbe2 not found: ID does not exist" Mar 10 14:17:08 crc kubenswrapper[4911]: I0310 14:17:08.201693 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7d817af-31f5-46da-a21c-0796724162c8" path="/var/lib/kubelet/pods/f7d817af-31f5-46da-a21c-0796724162c8/volumes" Mar 10 14:17:09 crc kubenswrapper[4911]: I0310 14:17:09.971755 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-hj6l2" event={"ID":"892d43d8-c12a-47b3-8056-d0d6024e961e","Type":"ContainerStarted","Data":"ab0b473281f0c7d39d23e095715aff53cdd9640710f30b56c7a1aef37e21da6f"} Mar 10 14:17:12 crc kubenswrapper[4911]: I0310 14:17:12.106072 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-v89h8" Mar 10 14:17:12 crc kubenswrapper[4911]: I0310 14:17:12.130231 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-hj6l2" podStartSLOduration=4.692453191 podStartE2EDuration="11.130204647s" podCreationTimestamp="2026-03-10 14:17:01 +0000 UTC" firstStartedPulling="2026-03-10 14:17:02.456170218 +0000 UTC m=+927.019690135" lastFinishedPulling="2026-03-10 14:17:08.893921674 +0000 UTC m=+933.457441591" observedRunningTime="2026-03-10 14:17:09.995297985 +0000 UTC m=+934.558817902" watchObservedRunningTime="2026-03-10 14:17:12.130204647 +0000 UTC m=+936.693724574" Mar 10 14:17:12 crc kubenswrapper[4911]: I0310 14:17:12.487191 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:12 crc kubenswrapper[4911]: I0310 14:17:12.487257 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:12 crc kubenswrapper[4911]: I0310 14:17:12.492498 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:13 crc kubenswrapper[4911]: I0310 14:17:13.017852 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f47c96fb4-9hh74" Mar 10 14:17:13 crc kubenswrapper[4911]: I0310 14:17:13.094605 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lpjl7"] Mar 10 14:17:22 crc kubenswrapper[4911]: I0310 14:17:22.058197 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-rr2pg" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.338966 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6"] Mar 10 14:17:35 crc kubenswrapper[4911]: E0310 14:17:35.339768 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d817af-31f5-46da-a21c-0796724162c8" containerName="extract-content" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.339782 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d817af-31f5-46da-a21c-0796724162c8" containerName="extract-content" Mar 10 14:17:35 crc kubenswrapper[4911]: E0310 14:17:35.339795 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d817af-31f5-46da-a21c-0796724162c8" containerName="extract-utilities" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.339801 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d817af-31f5-46da-a21c-0796724162c8" containerName="extract-utilities" Mar 10 14:17:35 crc 
kubenswrapper[4911]: E0310 14:17:35.339814 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d817af-31f5-46da-a21c-0796724162c8" containerName="registry-server" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.339820 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d817af-31f5-46da-a21c-0796724162c8" containerName="registry-server" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.339927 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d817af-31f5-46da-a21c-0796724162c8" containerName="registry-server" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.340799 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.343379 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.362926 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6"] Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.473459 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.473514 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-util\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.473537 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrn4q\" (UniqueName: \"kubernetes.io/projected/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-kube-api-access-xrn4q\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.575503 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrn4q\" (UniqueName: \"kubernetes.io/projected/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-kube-api-access-xrn4q\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.575565 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.575666 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6\" (UID: 
\"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.576162 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.576750 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.609511 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrn4q\" (UniqueName: \"kubernetes.io/projected/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-kube-api-access-xrn4q\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:35 crc kubenswrapper[4911]: I0310 14:17:35.656227 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:36 crc kubenswrapper[4911]: I0310 14:17:36.139211 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6"] Mar 10 14:17:36 crc kubenswrapper[4911]: I0310 14:17:36.669890 4911 generic.go:334] "Generic (PLEG): container finished" podID="4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" containerID="71be23ff5dfe933e8bb1d53d7c9653f78bc1df4d3e4d8340a554fd5d6253c675" exitCode=0 Mar 10 14:17:36 crc kubenswrapper[4911]: I0310 14:17:36.670045 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" event={"ID":"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3","Type":"ContainerDied","Data":"71be23ff5dfe933e8bb1d53d7c9653f78bc1df4d3e4d8340a554fd5d6253c675"} Mar 10 14:17:36 crc kubenswrapper[4911]: I0310 14:17:36.673282 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" event={"ID":"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3","Type":"ContainerStarted","Data":"c2784f847beef5b1c2823e94d83d9018c1517285f96f753c56c59c1a1bb4a8d1"} Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.155310 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-lpjl7" podUID="28f8f1c7-122d-47a4-8de8-90db75c3365b" containerName="console" containerID="cri-o://bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724" gracePeriod=15 Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.563403 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lpjl7_28f8f1c7-122d-47a4-8de8-90db75c3365b/console/0.log" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.563757 4911 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.686126 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lpjl7_28f8f1c7-122d-47a4-8de8-90db75c3365b/console/0.log" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.686180 4911 generic.go:334] "Generic (PLEG): container finished" podID="28f8f1c7-122d-47a4-8de8-90db75c3365b" containerID="bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724" exitCode=2 Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.686238 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lpjl7" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.686241 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lpjl7" event={"ID":"28f8f1c7-122d-47a4-8de8-90db75c3365b","Type":"ContainerDied","Data":"bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724"} Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.686353 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lpjl7" event={"ID":"28f8f1c7-122d-47a4-8de8-90db75c3365b","Type":"ContainerDied","Data":"6b5a6cc00b38085d5ef888ddedb3a6abb462c75175925d945c102fb5bfce577b"} Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.686374 4911 scope.go:117] "RemoveContainer" containerID="bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.689241 4911 generic.go:334] "Generic (PLEG): container finished" podID="4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" containerID="50a27d8e05fad95aea7ea0476defb79cff8471989ab5271b0bf0bb130fd4c5ad" exitCode=0 Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.689320 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" event={"ID":"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3","Type":"ContainerDied","Data":"50a27d8e05fad95aea7ea0476defb79cff8471989ab5271b0bf0bb130fd4c5ad"} Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.721405 4911 scope.go:117] "RemoveContainer" containerID="bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724" Mar 10 14:17:38 crc kubenswrapper[4911]: E0310 14:17:38.721919 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724\": container with ID starting with bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724 not found: ID does not exist" containerID="bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.722006 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724"} err="failed to get container status \"bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724\": rpc error: code = NotFound desc = could not find container \"bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724\": container with ID starting with bc8397cffe9aead29b0f7412ca6e9057e368012c78e88e8677f69e789e1e9724 not found: ID does not exist" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.738808 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-config\") pod \"28f8f1c7-122d-47a4-8de8-90db75c3365b\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.739123 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fnqgx\" (UniqueName: \"kubernetes.io/projected/28f8f1c7-122d-47a4-8de8-90db75c3365b-kube-api-access-fnqgx\") pod \"28f8f1c7-122d-47a4-8de8-90db75c3365b\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.739194 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-trusted-ca-bundle\") pod \"28f8f1c7-122d-47a4-8de8-90db75c3365b\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.739238 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-oauth-serving-cert\") pod \"28f8f1c7-122d-47a4-8de8-90db75c3365b\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.739294 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-serving-cert\") pod \"28f8f1c7-122d-47a4-8de8-90db75c3365b\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.739314 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-oauth-config\") pod \"28f8f1c7-122d-47a4-8de8-90db75c3365b\" (UID: \"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.739347 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-service-ca\") pod \"28f8f1c7-122d-47a4-8de8-90db75c3365b\" (UID: 
\"28f8f1c7-122d-47a4-8de8-90db75c3365b\") " Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.739899 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "28f8f1c7-122d-47a4-8de8-90db75c3365b" (UID: "28f8f1c7-122d-47a4-8de8-90db75c3365b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.739934 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-config" (OuterVolumeSpecName: "console-config") pod "28f8f1c7-122d-47a4-8de8-90db75c3365b" (UID: "28f8f1c7-122d-47a4-8de8-90db75c3365b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.740296 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "28f8f1c7-122d-47a4-8de8-90db75c3365b" (UID: "28f8f1c7-122d-47a4-8de8-90db75c3365b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.740384 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-service-ca" (OuterVolumeSpecName: "service-ca") pod "28f8f1c7-122d-47a4-8de8-90db75c3365b" (UID: "28f8f1c7-122d-47a4-8de8-90db75c3365b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.745315 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "28f8f1c7-122d-47a4-8de8-90db75c3365b" (UID: "28f8f1c7-122d-47a4-8de8-90db75c3365b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.745400 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "28f8f1c7-122d-47a4-8de8-90db75c3365b" (UID: "28f8f1c7-122d-47a4-8de8-90db75c3365b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.745737 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f8f1c7-122d-47a4-8de8-90db75c3365b-kube-api-access-fnqgx" (OuterVolumeSpecName: "kube-api-access-fnqgx") pod "28f8f1c7-122d-47a4-8de8-90db75c3365b" (UID: "28f8f1c7-122d-47a4-8de8-90db75c3365b"). InnerVolumeSpecName "kube-api-access-fnqgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.840330 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.840373 4911 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.840382 4911 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.840393 4911 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.840402 4911 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.840411 4911 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28f8f1c7-122d-47a4-8de8-90db75c3365b-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:38 crc kubenswrapper[4911]: I0310 14:17:38.840420 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnqgx\" (UniqueName: \"kubernetes.io/projected/28f8f1c7-122d-47a4-8de8-90db75c3365b-kube-api-access-fnqgx\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:39 crc 
kubenswrapper[4911]: I0310 14:17:39.017814 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lpjl7"] Mar 10 14:17:39 crc kubenswrapper[4911]: I0310 14:17:39.021405 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-lpjl7"] Mar 10 14:17:39 crc kubenswrapper[4911]: I0310 14:17:39.699057 4911 generic.go:334] "Generic (PLEG): container finished" podID="4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" containerID="b5f5316b7c88f6068f6b5d776f6bf5087adcf1fc21f3b89625542f47a6893db8" exitCode=0 Mar 10 14:17:39 crc kubenswrapper[4911]: I0310 14:17:39.699129 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" event={"ID":"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3","Type":"ContainerDied","Data":"b5f5316b7c88f6068f6b5d776f6bf5087adcf1fc21f3b89625542f47a6893db8"} Mar 10 14:17:40 crc kubenswrapper[4911]: I0310 14:17:40.205123 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f8f1c7-122d-47a4-8de8-90db75c3365b" path="/var/lib/kubelet/pods/28f8f1c7-122d-47a4-8de8-90db75c3365b/volumes" Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.003801 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.111147 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-bundle\") pod \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.111312 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-util\") pod \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.111379 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrn4q\" (UniqueName: \"kubernetes.io/projected/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-kube-api-access-xrn4q\") pod \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\" (UID: \"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3\") " Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.112306 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-bundle" (OuterVolumeSpecName: "bundle") pod "4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" (UID: "4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.119594 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-kube-api-access-xrn4q" (OuterVolumeSpecName: "kube-api-access-xrn4q") pod "4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" (UID: "4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3"). InnerVolumeSpecName "kube-api-access-xrn4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.131540 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-util" (OuterVolumeSpecName: "util") pod "4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" (UID: "4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.214207 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrn4q\" (UniqueName: \"kubernetes.io/projected/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-kube-api-access-xrn4q\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.214240 4911 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.214251 4911 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3-util\") on node \"crc\" DevicePath \"\"" Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.716773 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" event={"ID":"4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3","Type":"ContainerDied","Data":"c2784f847beef5b1c2823e94d83d9018c1517285f96f753c56c59c1a1bb4a8d1"} Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.716839 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2784f847beef5b1c2823e94d83d9018c1517285f96f753c56c59c1a1bb4a8d1" Mar 10 14:17:41 crc kubenswrapper[4911]: I0310 14:17:41.716844 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6" Mar 10 14:17:48 crc kubenswrapper[4911]: I0310 14:17:48.521181 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:17:48 crc kubenswrapper[4911]: I0310 14:17:48.521783 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.819296 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4"] Mar 10 14:17:50 crc kubenswrapper[4911]: E0310 14:17:50.819789 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" containerName="extract" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.819802 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" containerName="extract" Mar 10 14:17:50 crc kubenswrapper[4911]: E0310 14:17:50.819817 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f8f1c7-122d-47a4-8de8-90db75c3365b" containerName="console" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.819823 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f8f1c7-122d-47a4-8de8-90db75c3365b" containerName="console" Mar 10 14:17:50 crc kubenswrapper[4911]: E0310 14:17:50.819847 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" 
containerName="pull" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.819853 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" containerName="pull" Mar 10 14:17:50 crc kubenswrapper[4911]: E0310 14:17:50.819868 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" containerName="util" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.819874 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" containerName="util" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.819982 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3" containerName="extract" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.819991 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f8f1c7-122d-47a4-8de8-90db75c3365b" containerName="console" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.820412 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.829115 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7vbxr" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.829329 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.829813 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.829895 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.831523 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.840038 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4"] Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.850268 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lbg2\" (UniqueName: \"kubernetes.io/projected/2ab11607-0b3a-4d76-bb7d-34fd3c9fa271-kube-api-access-9lbg2\") pod \"metallb-operator-controller-manager-6d5b96ddc6-xbpp4\" (UID: \"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271\") " pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.850309 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ab11607-0b3a-4d76-bb7d-34fd3c9fa271-webhook-cert\") pod 
\"metallb-operator-controller-manager-6d5b96ddc6-xbpp4\" (UID: \"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271\") " pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.850331 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ab11607-0b3a-4d76-bb7d-34fd3c9fa271-apiservice-cert\") pod \"metallb-operator-controller-manager-6d5b96ddc6-xbpp4\" (UID: \"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271\") " pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.951176 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lbg2\" (UniqueName: \"kubernetes.io/projected/2ab11607-0b3a-4d76-bb7d-34fd3c9fa271-kube-api-access-9lbg2\") pod \"metallb-operator-controller-manager-6d5b96ddc6-xbpp4\" (UID: \"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271\") " pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.951485 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ab11607-0b3a-4d76-bb7d-34fd3c9fa271-webhook-cert\") pod \"metallb-operator-controller-manager-6d5b96ddc6-xbpp4\" (UID: \"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271\") " pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.951557 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ab11607-0b3a-4d76-bb7d-34fd3c9fa271-apiservice-cert\") pod \"metallb-operator-controller-manager-6d5b96ddc6-xbpp4\" (UID: \"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271\") " pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:50 crc 
kubenswrapper[4911]: I0310 14:17:50.959899 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ab11607-0b3a-4d76-bb7d-34fd3c9fa271-apiservice-cert\") pod \"metallb-operator-controller-manager-6d5b96ddc6-xbpp4\" (UID: \"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271\") " pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.970493 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lbg2\" (UniqueName: \"kubernetes.io/projected/2ab11607-0b3a-4d76-bb7d-34fd3c9fa271-kube-api-access-9lbg2\") pod \"metallb-operator-controller-manager-6d5b96ddc6-xbpp4\" (UID: \"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271\") " pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:50 crc kubenswrapper[4911]: I0310 14:17:50.971813 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ab11607-0b3a-4d76-bb7d-34fd3c9fa271-webhook-cert\") pod \"metallb-operator-controller-manager-6d5b96ddc6-xbpp4\" (UID: \"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271\") " pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.140426 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.164331 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77"] Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.165392 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.168404 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-k2cgd" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.168893 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.169198 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.176827 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77"] Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.256385 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnjwn\" (UniqueName: \"kubernetes.io/projected/bca856e8-824f-41ea-999f-353b10773511-kube-api-access-xnjwn\") pod \"metallb-operator-webhook-server-6555f87c79-t9n77\" (UID: \"bca856e8-824f-41ea-999f-353b10773511\") " pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.256510 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bca856e8-824f-41ea-999f-353b10773511-apiservice-cert\") pod \"metallb-operator-webhook-server-6555f87c79-t9n77\" (UID: \"bca856e8-824f-41ea-999f-353b10773511\") " pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.256538 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/bca856e8-824f-41ea-999f-353b10773511-webhook-cert\") pod \"metallb-operator-webhook-server-6555f87c79-t9n77\" (UID: \"bca856e8-824f-41ea-999f-353b10773511\") " pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.358859 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bca856e8-824f-41ea-999f-353b10773511-apiservice-cert\") pod \"metallb-operator-webhook-server-6555f87c79-t9n77\" (UID: \"bca856e8-824f-41ea-999f-353b10773511\") " pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.358931 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bca856e8-824f-41ea-999f-353b10773511-webhook-cert\") pod \"metallb-operator-webhook-server-6555f87c79-t9n77\" (UID: \"bca856e8-824f-41ea-999f-353b10773511\") " pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.359000 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnjwn\" (UniqueName: \"kubernetes.io/projected/bca856e8-824f-41ea-999f-353b10773511-kube-api-access-xnjwn\") pod \"metallb-operator-webhook-server-6555f87c79-t9n77\" (UID: \"bca856e8-824f-41ea-999f-353b10773511\") " pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.363909 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bca856e8-824f-41ea-999f-353b10773511-webhook-cert\") pod \"metallb-operator-webhook-server-6555f87c79-t9n77\" (UID: \"bca856e8-824f-41ea-999f-353b10773511\") " pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc 
kubenswrapper[4911]: I0310 14:17:51.381276 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bca856e8-824f-41ea-999f-353b10773511-apiservice-cert\") pod \"metallb-operator-webhook-server-6555f87c79-t9n77\" (UID: \"bca856e8-824f-41ea-999f-353b10773511\") " pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.388738 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnjwn\" (UniqueName: \"kubernetes.io/projected/bca856e8-824f-41ea-999f-353b10773511-kube-api-access-xnjwn\") pod \"metallb-operator-webhook-server-6555f87c79-t9n77\" (UID: \"bca856e8-824f-41ea-999f-353b10773511\") " pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.428927 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4"] Mar 10 14:17:51 crc kubenswrapper[4911]: W0310 14:17:51.442207 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab11607_0b3a_4d76_bb7d_34fd3c9fa271.slice/crio-5d86fbf53752dd8276e9b85b284291568ec5c77b8111e3e4962de0eb00e478bf WatchSource:0}: Error finding container 5d86fbf53752dd8276e9b85b284291568ec5c77b8111e3e4962de0eb00e478bf: Status 404 returned error can't find the container with id 5d86fbf53752dd8276e9b85b284291568ec5c77b8111e3e4962de0eb00e478bf Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.524031 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.779137 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" event={"ID":"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271","Type":"ContainerStarted","Data":"5d86fbf53752dd8276e9b85b284291568ec5c77b8111e3e4962de0eb00e478bf"} Mar 10 14:17:51 crc kubenswrapper[4911]: I0310 14:17:51.802481 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77"] Mar 10 14:17:51 crc kubenswrapper[4911]: W0310 14:17:51.806966 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbca856e8_824f_41ea_999f_353b10773511.slice/crio-71ce86dd110886088c0bfef5836735aee7b904aa2655093829b7b5f3ad10dfd3 WatchSource:0}: Error finding container 71ce86dd110886088c0bfef5836735aee7b904aa2655093829b7b5f3ad10dfd3: Status 404 returned error can't find the container with id 71ce86dd110886088c0bfef5836735aee7b904aa2655093829b7b5f3ad10dfd3 Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.655527 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wlbn4"] Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.657343 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.666203 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wlbn4"] Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.676664 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-catalog-content\") pod \"community-operators-wlbn4\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.676810 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jqpt\" (UniqueName: \"kubernetes.io/projected/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-kube-api-access-4jqpt\") pod \"community-operators-wlbn4\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.676836 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-utilities\") pod \"community-operators-wlbn4\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.778536 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-catalog-content\") pod \"community-operators-wlbn4\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.778669 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4jqpt\" (UniqueName: \"kubernetes.io/projected/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-kube-api-access-4jqpt\") pod \"community-operators-wlbn4\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.778695 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-utilities\") pod \"community-operators-wlbn4\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.779267 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-utilities\") pod \"community-operators-wlbn4\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.779297 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-catalog-content\") pod \"community-operators-wlbn4\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.789272 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" event={"ID":"bca856e8-824f-41ea-999f-353b10773511","Type":"ContainerStarted","Data":"71ce86dd110886088c0bfef5836735aee7b904aa2655093829b7b5f3ad10dfd3"} Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.816417 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jqpt\" (UniqueName: 
\"kubernetes.io/projected/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-kube-api-access-4jqpt\") pod \"community-operators-wlbn4\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:52 crc kubenswrapper[4911]: I0310 14:17:52.976233 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:17:53 crc kubenswrapper[4911]: I0310 14:17:53.234380 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wlbn4"] Mar 10 14:17:53 crc kubenswrapper[4911]: W0310 14:17:53.268589 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff4a908_309b_4b6f_a1b2_b48a41d9a5b8.slice/crio-29c11ae7be0c7e5ac64a274b3c33e50c099cc2df79497ba7ddc2e645c8e680fa WatchSource:0}: Error finding container 29c11ae7be0c7e5ac64a274b3c33e50c099cc2df79497ba7ddc2e645c8e680fa: Status 404 returned error can't find the container with id 29c11ae7be0c7e5ac64a274b3c33e50c099cc2df79497ba7ddc2e645c8e680fa Mar 10 14:17:53 crc kubenswrapper[4911]: I0310 14:17:53.800809 4911 generic.go:334] "Generic (PLEG): container finished" podID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerID="64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7" exitCode=0 Mar 10 14:17:53 crc kubenswrapper[4911]: I0310 14:17:53.801006 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbn4" event={"ID":"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8","Type":"ContainerDied","Data":"64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7"} Mar 10 14:17:53 crc kubenswrapper[4911]: I0310 14:17:53.801259 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbn4" 
event={"ID":"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8","Type":"ContainerStarted","Data":"29c11ae7be0c7e5ac64a274b3c33e50c099cc2df79497ba7ddc2e645c8e680fa"} Mar 10 14:17:55 crc kubenswrapper[4911]: I0310 14:17:55.819527 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbn4" event={"ID":"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8","Type":"ContainerStarted","Data":"05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305"} Mar 10 14:17:55 crc kubenswrapper[4911]: I0310 14:17:55.823047 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" event={"ID":"2ab11607-0b3a-4d76-bb7d-34fd3c9fa271","Type":"ContainerStarted","Data":"3cb305483026cdd7ddd641570499b543dd3ba35834a87d4cc6c6f56271d97748"} Mar 10 14:17:55 crc kubenswrapper[4911]: I0310 14:17:55.823313 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" Mar 10 14:17:55 crc kubenswrapper[4911]: I0310 14:17:55.873975 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4" podStartSLOduration=1.7996305970000002 podStartE2EDuration="5.873947146s" podCreationTimestamp="2026-03-10 14:17:50 +0000 UTC" firstStartedPulling="2026-03-10 14:17:51.448625198 +0000 UTC m=+976.012145115" lastFinishedPulling="2026-03-10 14:17:55.522941747 +0000 UTC m=+980.086461664" observedRunningTime="2026-03-10 14:17:55.873094253 +0000 UTC m=+980.436614170" watchObservedRunningTime="2026-03-10 14:17:55.873947146 +0000 UTC m=+980.437467063" Mar 10 14:17:56 crc kubenswrapper[4911]: I0310 14:17:56.834519 4911 generic.go:334] "Generic (PLEG): container finished" podID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerID="05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305" exitCode=0 Mar 10 14:17:56 crc kubenswrapper[4911]: I0310 
14:17:56.834636 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbn4" event={"ID":"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8","Type":"ContainerDied","Data":"05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305"} Mar 10 14:17:58 crc kubenswrapper[4911]: I0310 14:17:58.850487 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" event={"ID":"bca856e8-824f-41ea-999f-353b10773511","Type":"ContainerStarted","Data":"e8d1914b5185e738244bfd371e8e420a2116c70d29e0ecf044177bb223bba729"} Mar 10 14:17:58 crc kubenswrapper[4911]: I0310 14:17:58.851480 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" Mar 10 14:17:58 crc kubenswrapper[4911]: I0310 14:17:58.853457 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbn4" event={"ID":"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8","Type":"ContainerStarted","Data":"e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed"} Mar 10 14:17:58 crc kubenswrapper[4911]: I0310 14:17:58.884243 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77" podStartSLOduration=1.5256266410000001 podStartE2EDuration="7.884221441s" podCreationTimestamp="2026-03-10 14:17:51 +0000 UTC" firstStartedPulling="2026-03-10 14:17:51.81088541 +0000 UTC m=+976.374405327" lastFinishedPulling="2026-03-10 14:17:58.16948021 +0000 UTC m=+982.733000127" observedRunningTime="2026-03-10 14:17:58.876703538 +0000 UTC m=+983.440223455" watchObservedRunningTime="2026-03-10 14:17:58.884221441 +0000 UTC m=+983.447741348" Mar 10 14:17:58 crc kubenswrapper[4911]: I0310 14:17:58.906326 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wlbn4" 
podStartSLOduration=2.230001598 podStartE2EDuration="6.906301346s" podCreationTimestamp="2026-03-10 14:17:52 +0000 UTC" firstStartedPulling="2026-03-10 14:17:53.810899392 +0000 UTC m=+978.374419309" lastFinishedPulling="2026-03-10 14:17:58.48719914 +0000 UTC m=+983.050719057" observedRunningTime="2026-03-10 14:17:58.900764627 +0000 UTC m=+983.464284544" watchObservedRunningTime="2026-03-10 14:17:58.906301346 +0000 UTC m=+983.469821273" Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.136864 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552538-v5wxw"] Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.138272 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552538-v5wxw" Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.140801 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.140879 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.141713 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.153828 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552538-v5wxw"] Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.302467 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79vkj\" (UniqueName: \"kubernetes.io/projected/2581a8bf-f280-4d52-a410-e1b8a019bfaa-kube-api-access-79vkj\") pod \"auto-csr-approver-29552538-v5wxw\" (UID: \"2581a8bf-f280-4d52-a410-e1b8a019bfaa\") " pod="openshift-infra/auto-csr-approver-29552538-v5wxw" Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 
14:18:00.404227 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79vkj\" (UniqueName: \"kubernetes.io/projected/2581a8bf-f280-4d52-a410-e1b8a019bfaa-kube-api-access-79vkj\") pod \"auto-csr-approver-29552538-v5wxw\" (UID: \"2581a8bf-f280-4d52-a410-e1b8a019bfaa\") " pod="openshift-infra/auto-csr-approver-29552538-v5wxw" Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.428154 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79vkj\" (UniqueName: \"kubernetes.io/projected/2581a8bf-f280-4d52-a410-e1b8a019bfaa-kube-api-access-79vkj\") pod \"auto-csr-approver-29552538-v5wxw\" (UID: \"2581a8bf-f280-4d52-a410-e1b8a019bfaa\") " pod="openshift-infra/auto-csr-approver-29552538-v5wxw" Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.459905 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552538-v5wxw" Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.842941 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552538-v5wxw"] Mar 10 14:18:00 crc kubenswrapper[4911]: I0310 14:18:00.865733 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552538-v5wxw" event={"ID":"2581a8bf-f280-4d52-a410-e1b8a019bfaa","Type":"ContainerStarted","Data":"bdc7336323c1005606d06da18bc600aa99c6b194bfe390e5669f2bbd0658a182"} Mar 10 14:18:02 crc kubenswrapper[4911]: I0310 14:18:02.883855 4911 generic.go:334] "Generic (PLEG): container finished" podID="2581a8bf-f280-4d52-a410-e1b8a019bfaa" containerID="b901f388fd87ff94d84fda609875a64624b2019071ed736af354521df410dab5" exitCode=0 Mar 10 14:18:02 crc kubenswrapper[4911]: I0310 14:18:02.883965 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552538-v5wxw" 
event={"ID":"2581a8bf-f280-4d52-a410-e1b8a019bfaa","Type":"ContainerDied","Data":"b901f388fd87ff94d84fda609875a64624b2019071ed736af354521df410dab5"} Mar 10 14:18:02 crc kubenswrapper[4911]: I0310 14:18:02.976716 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:18:02 crc kubenswrapper[4911]: I0310 14:18:02.976797 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:18:03 crc kubenswrapper[4911]: I0310 14:18:03.063742 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:18:03 crc kubenswrapper[4911]: I0310 14:18:03.932611 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:18:04 crc kubenswrapper[4911]: I0310 14:18:04.165858 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552538-v5wxw" Mar 10 14:18:04 crc kubenswrapper[4911]: I0310 14:18:04.267123 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79vkj\" (UniqueName: \"kubernetes.io/projected/2581a8bf-f280-4d52-a410-e1b8a019bfaa-kube-api-access-79vkj\") pod \"2581a8bf-f280-4d52-a410-e1b8a019bfaa\" (UID: \"2581a8bf-f280-4d52-a410-e1b8a019bfaa\") " Mar 10 14:18:04 crc kubenswrapper[4911]: I0310 14:18:04.272880 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2581a8bf-f280-4d52-a410-e1b8a019bfaa-kube-api-access-79vkj" (OuterVolumeSpecName: "kube-api-access-79vkj") pod "2581a8bf-f280-4d52-a410-e1b8a019bfaa" (UID: "2581a8bf-f280-4d52-a410-e1b8a019bfaa"). InnerVolumeSpecName "kube-api-access-79vkj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:18:04 crc kubenswrapper[4911]: I0310 14:18:04.368668 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79vkj\" (UniqueName: \"kubernetes.io/projected/2581a8bf-f280-4d52-a410-e1b8a019bfaa-kube-api-access-79vkj\") on node \"crc\" DevicePath \"\"" Mar 10 14:18:04 crc kubenswrapper[4911]: I0310 14:18:04.897148 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552538-v5wxw" Mar 10 14:18:04 crc kubenswrapper[4911]: I0310 14:18:04.897144 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552538-v5wxw" event={"ID":"2581a8bf-f280-4d52-a410-e1b8a019bfaa","Type":"ContainerDied","Data":"bdc7336323c1005606d06da18bc600aa99c6b194bfe390e5669f2bbd0658a182"} Mar 10 14:18:04 crc kubenswrapper[4911]: I0310 14:18:04.897237 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdc7336323c1005606d06da18bc600aa99c6b194bfe390e5669f2bbd0658a182" Mar 10 14:18:05 crc kubenswrapper[4911]: I0310 14:18:05.240968 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552532-ln5mn"] Mar 10 14:18:05 crc kubenswrapper[4911]: I0310 14:18:05.245693 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552532-ln5mn"] Mar 10 14:18:05 crc kubenswrapper[4911]: I0310 14:18:05.435276 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wlbn4"] Mar 10 14:18:05 crc kubenswrapper[4911]: I0310 14:18:05.906325 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wlbn4" podUID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerName="registry-server" containerID="cri-o://e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed" gracePeriod=2 Mar 10 14:18:06 crc 
kubenswrapper[4911]: I0310 14:18:06.217077 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac79edf7-7478-4f46-b74e-a3db1d75d52f" path="/var/lib/kubelet/pods/ac79edf7-7478-4f46-b74e-a3db1d75d52f/volumes" Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.330229 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.504056 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-catalog-content\") pod \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.504178 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jqpt\" (UniqueName: \"kubernetes.io/projected/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-kube-api-access-4jqpt\") pod \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.504204 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-utilities\") pod \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\" (UID: \"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8\") " Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.505103 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-utilities" (OuterVolumeSpecName: "utilities") pod "5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" (UID: "5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.517940 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-kube-api-access-4jqpt" (OuterVolumeSpecName: "kube-api-access-4jqpt") pod "5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" (UID: "5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8"). InnerVolumeSpecName "kube-api-access-4jqpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.606210 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jqpt\" (UniqueName: \"kubernetes.io/projected/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-kube-api-access-4jqpt\") on node \"crc\" DevicePath \"\"" Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.606258 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.919849 4911 generic.go:334] "Generic (PLEG): container finished" podID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerID="e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed" exitCode=0 Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.919898 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbn4" event={"ID":"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8","Type":"ContainerDied","Data":"e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed"} Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.919933 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbn4" event={"ID":"5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8","Type":"ContainerDied","Data":"29c11ae7be0c7e5ac64a274b3c33e50c099cc2df79497ba7ddc2e645c8e680fa"} Mar 10 14:18:06 crc kubenswrapper[4911]: 
I0310 14:18:06.919951 4911 scope.go:117] "RemoveContainer" containerID="e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed" Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.920482 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wlbn4" Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.952250 4911 scope.go:117] "RemoveContainer" containerID="05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305" Mar 10 14:18:06 crc kubenswrapper[4911]: I0310 14:18:06.973088 4911 scope.go:117] "RemoveContainer" containerID="64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7" Mar 10 14:18:07 crc kubenswrapper[4911]: I0310 14:18:07.015210 4911 scope.go:117] "RemoveContainer" containerID="e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed" Mar 10 14:18:07 crc kubenswrapper[4911]: E0310 14:18:07.016082 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed\": container with ID starting with e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed not found: ID does not exist" containerID="e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed" Mar 10 14:18:07 crc kubenswrapper[4911]: I0310 14:18:07.016146 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed"} err="failed to get container status \"e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed\": rpc error: code = NotFound desc = could not find container \"e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed\": container with ID starting with e9694d0c67b75198ee1bc45c17eb578ea4cdae1b97ecc99b8626f0ed037b23ed not found: ID does not exist" Mar 10 14:18:07 crc kubenswrapper[4911]: I0310 14:18:07.016184 4911 
scope.go:117] "RemoveContainer" containerID="05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305" Mar 10 14:18:07 crc kubenswrapper[4911]: E0310 14:18:07.016830 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305\": container with ID starting with 05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305 not found: ID does not exist" containerID="05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305" Mar 10 14:18:07 crc kubenswrapper[4911]: I0310 14:18:07.016916 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305"} err="failed to get container status \"05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305\": rpc error: code = NotFound desc = could not find container \"05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305\": container with ID starting with 05151755d6fa8bd66fb21dc38493a3f840d6a57e0d36b66c10e6905a805da305 not found: ID does not exist" Mar 10 14:18:07 crc kubenswrapper[4911]: I0310 14:18:07.016954 4911 scope.go:117] "RemoveContainer" containerID="64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7" Mar 10 14:18:07 crc kubenswrapper[4911]: E0310 14:18:07.018118 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7\": container with ID starting with 64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7 not found: ID does not exist" containerID="64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7" Mar 10 14:18:07 crc kubenswrapper[4911]: I0310 14:18:07.018170 4911 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7"} err="failed to get container status \"64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7\": rpc error: code = NotFound desc = could not find container \"64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7\": container with ID starting with 64e85a8edb88e5322fac85dcb136cafa16b56dc3dc1f18849d91c7a05ce8c0d7 not found: ID does not exist"
Mar 10 14:18:07 crc kubenswrapper[4911]: I0310 14:18:07.071263 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" (UID: "5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:18:07 crc kubenswrapper[4911]: I0310 14:18:07.114420 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:18:07 crc kubenswrapper[4911]: I0310 14:18:07.264981 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wlbn4"]
Mar 10 14:18:07 crc kubenswrapper[4911]: I0310 14:18:07.270156 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wlbn4"]
Mar 10 14:18:08 crc kubenswrapper[4911]: I0310 14:18:08.201231 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" path="/var/lib/kubelet/pods/5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8/volumes"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.530849 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6555f87c79-t9n77"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.951860 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xbzc7"]
Mar 10 14:18:11 crc kubenswrapper[4911]: E0310 14:18:11.952146 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerName="extract-utilities"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.952161 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerName="extract-utilities"
Mar 10 14:18:11 crc kubenswrapper[4911]: E0310 14:18:11.952172 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2581a8bf-f280-4d52-a410-e1b8a019bfaa" containerName="oc"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.952179 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="2581a8bf-f280-4d52-a410-e1b8a019bfaa" containerName="oc"
Mar 10 14:18:11 crc kubenswrapper[4911]: E0310 14:18:11.952190 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerName="extract-content"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.952196 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerName="extract-content"
Mar 10 14:18:11 crc kubenswrapper[4911]: E0310 14:18:11.952213 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerName="registry-server"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.952219 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerName="registry-server"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.952315 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff4a908-309b-4b6f-a1b2-b48a41d9a5b8" containerName="registry-server"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.952329 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="2581a8bf-f280-4d52-a410-e1b8a019bfaa" containerName="oc"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.953373 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.963222 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xbzc7"]
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.979128 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-catalog-content\") pod \"redhat-marketplace-xbzc7\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") " pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.979223 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-utilities\") pod \"redhat-marketplace-xbzc7\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") " pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:11 crc kubenswrapper[4911]: I0310 14:18:11.979263 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzd5s\" (UniqueName: \"kubernetes.io/projected/507a0b41-4ba9-44f3-adf7-40e21535397e-kube-api-access-tzd5s\") pod \"redhat-marketplace-xbzc7\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") " pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.094674 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-catalog-content\") pod \"redhat-marketplace-xbzc7\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") " pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.094799 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-utilities\") pod \"redhat-marketplace-xbzc7\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") " pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.094850 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzd5s\" (UniqueName: \"kubernetes.io/projected/507a0b41-4ba9-44f3-adf7-40e21535397e-kube-api-access-tzd5s\") pod \"redhat-marketplace-xbzc7\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") " pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.095956 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-catalog-content\") pod \"redhat-marketplace-xbzc7\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") " pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.096283 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-utilities\") pod \"redhat-marketplace-xbzc7\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") " pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.124597 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzd5s\" (UniqueName: \"kubernetes.io/projected/507a0b41-4ba9-44f3-adf7-40e21535397e-kube-api-access-tzd5s\") pod \"redhat-marketplace-xbzc7\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") " pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.272930 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:12 crc kubenswrapper[4911]: W0310 14:18:12.557285 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod507a0b41_4ba9_44f3_adf7_40e21535397e.slice/crio-23dc2a7c3bbbdb24b62d980434cb484b58abad10da8dce76d86b5354568bd20e WatchSource:0}: Error finding container 23dc2a7c3bbbdb24b62d980434cb484b58abad10da8dce76d86b5354568bd20e: Status 404 returned error can't find the container with id 23dc2a7c3bbbdb24b62d980434cb484b58abad10da8dce76d86b5354568bd20e
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.561248 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xbzc7"]
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.956743 4911 generic.go:334] "Generic (PLEG): container finished" podID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerID="a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8" exitCode=0
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.956802 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbzc7" event={"ID":"507a0b41-4ba9-44f3-adf7-40e21535397e","Type":"ContainerDied","Data":"a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8"}
Mar 10 14:18:12 crc kubenswrapper[4911]: I0310 14:18:12.956839 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbzc7" event={"ID":"507a0b41-4ba9-44f3-adf7-40e21535397e","Type":"ContainerStarted","Data":"23dc2a7c3bbbdb24b62d980434cb484b58abad10da8dce76d86b5354568bd20e"}
Mar 10 14:18:13 crc kubenswrapper[4911]: I0310 14:18:13.966101 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbzc7" event={"ID":"507a0b41-4ba9-44f3-adf7-40e21535397e","Type":"ContainerStarted","Data":"94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5"}
Mar 10 14:18:14 crc kubenswrapper[4911]: I0310 14:18:14.974297 4911 generic.go:334] "Generic (PLEG): container finished" podID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerID="94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5" exitCode=0
Mar 10 14:18:14 crc kubenswrapper[4911]: I0310 14:18:14.974390 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbzc7" event={"ID":"507a0b41-4ba9-44f3-adf7-40e21535397e","Type":"ContainerDied","Data":"94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5"}
Mar 10 14:18:15 crc kubenswrapper[4911]: I0310 14:18:15.983149 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbzc7" event={"ID":"507a0b41-4ba9-44f3-adf7-40e21535397e","Type":"ContainerStarted","Data":"b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0"}
Mar 10 14:18:16 crc kubenswrapper[4911]: I0310 14:18:16.007332 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xbzc7" podStartSLOduration=2.452931156 podStartE2EDuration="5.007318563s" podCreationTimestamp="2026-03-10 14:18:11 +0000 UTC" firstStartedPulling="2026-03-10 14:18:12.958168109 +0000 UTC m=+997.521688026" lastFinishedPulling="2026-03-10 14:18:15.512555516 +0000 UTC m=+1000.076075433" observedRunningTime="2026-03-10 14:18:16.00462638 +0000 UTC m=+1000.568146297" watchObservedRunningTime="2026-03-10 14:18:16.007318563 +0000 UTC m=+1000.570838480"
Mar 10 14:18:18 crc kubenswrapper[4911]: I0310 14:18:18.520564 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:18:18 crc kubenswrapper[4911]: I0310 14:18:18.520850 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.274105 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.274449 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.324476 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.719842 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ttfzk"]
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.721125 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.743711 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ttfzk"]
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.872357 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-catalog-content\") pod \"certified-operators-ttfzk\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") " pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.872406 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4brpt\" (UniqueName: \"kubernetes.io/projected/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-kube-api-access-4brpt\") pod \"certified-operators-ttfzk\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") " pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.872439 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-utilities\") pod \"certified-operators-ttfzk\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") " pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.975419 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-catalog-content\") pod \"certified-operators-ttfzk\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") " pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.975459 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4brpt\" (UniqueName: \"kubernetes.io/projected/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-kube-api-access-4brpt\") pod \"certified-operators-ttfzk\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") " pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.975475 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-utilities\") pod \"certified-operators-ttfzk\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") " pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.975923 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-utilities\") pod \"certified-operators-ttfzk\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") " pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.976060 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-catalog-content\") pod \"certified-operators-ttfzk\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") " pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:22 crc kubenswrapper[4911]: I0310 14:18:22.997061 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4brpt\" (UniqueName: \"kubernetes.io/projected/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-kube-api-access-4brpt\") pod \"certified-operators-ttfzk\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") " pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:23 crc kubenswrapper[4911]: I0310 14:18:23.046459 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:23 crc kubenswrapper[4911]: I0310 14:18:23.082446 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:23 crc kubenswrapper[4911]: I0310 14:18:23.305152 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ttfzk"]
Mar 10 14:18:24 crc kubenswrapper[4911]: I0310 14:18:24.038976 4911 generic.go:334] "Generic (PLEG): container finished" podID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerID="9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8" exitCode=0
Mar 10 14:18:24 crc kubenswrapper[4911]: I0310 14:18:24.040017 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttfzk" event={"ID":"b0a43013-ecd0-4c18-a62b-efeeba73e7ce","Type":"ContainerDied","Data":"9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8"}
Mar 10 14:18:24 crc kubenswrapper[4911]: I0310 14:18:24.040059 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttfzk" event={"ID":"b0a43013-ecd0-4c18-a62b-efeeba73e7ce","Type":"ContainerStarted","Data":"948b5a13f48d9e3cd875e506c9faee52c74680f1dfcf43c3776986d22ee14e5e"}
Mar 10 14:18:25 crc kubenswrapper[4911]: I0310 14:18:25.049797 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttfzk" event={"ID":"b0a43013-ecd0-4c18-a62b-efeeba73e7ce","Type":"ContainerStarted","Data":"26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0"}
Mar 10 14:18:25 crc kubenswrapper[4911]: I0310 14:18:25.390450 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xbzc7"]
Mar 10 14:18:25 crc kubenswrapper[4911]: I0310 14:18:25.390896 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xbzc7" podUID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerName="registry-server" containerID="cri-o://b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0" gracePeriod=2
Mar 10 14:18:25 crc kubenswrapper[4911]: I0310 14:18:25.790762 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:25 crc kubenswrapper[4911]: I0310 14:18:25.921959 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzd5s\" (UniqueName: \"kubernetes.io/projected/507a0b41-4ba9-44f3-adf7-40e21535397e-kube-api-access-tzd5s\") pod \"507a0b41-4ba9-44f3-adf7-40e21535397e\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") "
Mar 10 14:18:25 crc kubenswrapper[4911]: I0310 14:18:25.922042 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-catalog-content\") pod \"507a0b41-4ba9-44f3-adf7-40e21535397e\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") "
Mar 10 14:18:25 crc kubenswrapper[4911]: I0310 14:18:25.922174 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-utilities\") pod \"507a0b41-4ba9-44f3-adf7-40e21535397e\" (UID: \"507a0b41-4ba9-44f3-adf7-40e21535397e\") "
Mar 10 14:18:25 crc kubenswrapper[4911]: I0310 14:18:25.923463 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-utilities" (OuterVolumeSpecName: "utilities") pod "507a0b41-4ba9-44f3-adf7-40e21535397e" (UID: "507a0b41-4ba9-44f3-adf7-40e21535397e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:18:25 crc kubenswrapper[4911]: I0310 14:18:25.929930 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507a0b41-4ba9-44f3-adf7-40e21535397e-kube-api-access-tzd5s" (OuterVolumeSpecName: "kube-api-access-tzd5s") pod "507a0b41-4ba9-44f3-adf7-40e21535397e" (UID: "507a0b41-4ba9-44f3-adf7-40e21535397e"). InnerVolumeSpecName "kube-api-access-tzd5s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:18:25 crc kubenswrapper[4911]: I0310 14:18:25.963084 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "507a0b41-4ba9-44f3-adf7-40e21535397e" (UID: "507a0b41-4ba9-44f3-adf7-40e21535397e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.024309 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzd5s\" (UniqueName: \"kubernetes.io/projected/507a0b41-4ba9-44f3-adf7-40e21535397e-kube-api-access-tzd5s\") on node \"crc\" DevicePath \"\""
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.024357 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.024370 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/507a0b41-4ba9-44f3-adf7-40e21535397e-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.058552 4911 generic.go:334] "Generic (PLEG): container finished" podID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerID="26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0" exitCode=0
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.058628 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttfzk" event={"ID":"b0a43013-ecd0-4c18-a62b-efeeba73e7ce","Type":"ContainerDied","Data":"26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0"}
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.064468 4911 generic.go:334] "Generic (PLEG): container finished" podID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerID="b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0" exitCode=0
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.064522 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbzc7" event={"ID":"507a0b41-4ba9-44f3-adf7-40e21535397e","Type":"ContainerDied","Data":"b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0"}
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.064556 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbzc7" event={"ID":"507a0b41-4ba9-44f3-adf7-40e21535397e","Type":"ContainerDied","Data":"23dc2a7c3bbbdb24b62d980434cb484b58abad10da8dce76d86b5354568bd20e"}
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.064584 4911 scope.go:117] "RemoveContainer" containerID="b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0"
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.064792 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xbzc7"
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.099537 4911 scope.go:117] "RemoveContainer" containerID="94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5"
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.112526 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xbzc7"]
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.120436 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xbzc7"]
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.132268 4911 scope.go:117] "RemoveContainer" containerID="a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8"
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.154612 4911 scope.go:117] "RemoveContainer" containerID="b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0"
Mar 10 14:18:26 crc kubenswrapper[4911]: E0310 14:18:26.155027 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0\": container with ID starting with b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0 not found: ID does not exist" containerID="b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0"
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.155065 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0"} err="failed to get container status \"b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0\": rpc error: code = NotFound desc = could not find container \"b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0\": container with ID starting with b628912d2e74568e0ec87b2e76bbe958c4b600d008eb250d6ce318cf0242f0e0 not found: ID does not exist"
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.155090 4911 scope.go:117] "RemoveContainer" containerID="94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5"
Mar 10 14:18:26 crc kubenswrapper[4911]: E0310 14:18:26.155317 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5\": container with ID starting with 94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5 not found: ID does not exist" containerID="94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5"
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.155345 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5"} err="failed to get container status \"94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5\": rpc error: code = NotFound desc = could not find container \"94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5\": container with ID starting with 94a298e3e1c29d916504c6857ef1ca3049de8a356fe5f73df81da6250ad7afa5 not found: ID does not exist"
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.155357 4911 scope.go:117] "RemoveContainer" containerID="a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8"
Mar 10 14:18:26 crc kubenswrapper[4911]: E0310 14:18:26.155573 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8\": container with ID starting with a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8 not found: ID does not exist" containerID="a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8"
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.155602 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8"} err="failed to get container status \"a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8\": rpc error: code = NotFound desc = could not find container \"a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8\": container with ID starting with a3df4ba66aa7f23cb1187e3f34f5ff106a728e5de4f1daf2586714dd117981d8 not found: ID does not exist"
Mar 10 14:18:26 crc kubenswrapper[4911]: I0310 14:18:26.203941 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="507a0b41-4ba9-44f3-adf7-40e21535397e" path="/var/lib/kubelet/pods/507a0b41-4ba9-44f3-adf7-40e21535397e/volumes"
Mar 10 14:18:27 crc kubenswrapper[4911]: I0310 14:18:27.073902 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttfzk" event={"ID":"b0a43013-ecd0-4c18-a62b-efeeba73e7ce","Type":"ContainerStarted","Data":"51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1"}
Mar 10 14:18:27 crc kubenswrapper[4911]: I0310 14:18:27.096449 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ttfzk" podStartSLOduration=2.648774148 podStartE2EDuration="5.096425156s" podCreationTimestamp="2026-03-10 14:18:22 +0000 UTC" firstStartedPulling="2026-03-10 14:18:24.04275781 +0000 UTC m=+1008.606277767" lastFinishedPulling="2026-03-10 14:18:26.490408848 +0000 UTC m=+1011.053928775" observedRunningTime="2026-03-10 14:18:27.091825722 +0000 UTC m=+1011.655345649" watchObservedRunningTime="2026-03-10 14:18:27.096425156 +0000 UTC m=+1011.659945073"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.145035 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6d5b96ddc6-xbpp4"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.884367 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx"]
Mar 10 14:18:31 crc kubenswrapper[4911]: E0310 14:18:31.885075 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerName="registry-server"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.885100 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerName="registry-server"
Mar 10 14:18:31 crc kubenswrapper[4911]: E0310 14:18:31.885117 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerName="extract-content"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.885124 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerName="extract-content"
Mar 10 14:18:31 crc kubenswrapper[4911]: E0310 14:18:31.885136 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerName="extract-utilities"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.885144 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerName="extract-utilities"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.885310 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="507a0b41-4ba9-44f3-adf7-40e21535397e" containerName="registry-server"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.886063 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.887876 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-6pqxq"]
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.889621 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.889631 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-dmdrc"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.890247 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.894159 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.894468 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.906151 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx"]
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.914191 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-frr-sockets\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.914239 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ad777989-99eb-4ec3-91d2-890190261a26-frr-startup\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.914330 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m2cq\" (UniqueName: \"kubernetes.io/projected/ad777989-99eb-4ec3-91d2-890190261a26-kube-api-access-5m2cq\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.914443 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-metrics\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.914474 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnpp\" (UniqueName: \"kubernetes.io/projected/26684eb9-05cb-450a-b9b3-225b34518a92-kube-api-access-ttnpp\") pod \"frr-k8s-webhook-server-7f989f654f-npzpx\" (UID: \"26684eb9-05cb-450a-b9b3-225b34518a92\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.914505 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad777989-99eb-4ec3-91d2-890190261a26-metrics-certs\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.914577 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-reloader\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.914609 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26684eb9-05cb-450a-b9b3-225b34518a92-cert\") pod \"frr-k8s-webhook-server-7f989f654f-npzpx\" (UID: \"26684eb9-05cb-450a-b9b3-225b34518a92\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx"
Mar 10 14:18:31 crc kubenswrapper[4911]: I0310 14:18:31.914661 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-frr-conf\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.010153 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-w9cpd"]
Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.011467 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-w9cpd"
Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.014251 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.014281 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015257 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-reloader\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015314 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26684eb9-05cb-450a-b9b3-225b34518a92-cert\") pod \"frr-k8s-webhook-server-7f989f654f-npzpx\" (UID: \"26684eb9-05cb-450a-b9b3-225b34518a92\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx"
Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015350 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpmsw\" (UniqueName: \"kubernetes.io/projected/884e47b6-07a7-4d77-b73a-ffa7a9a59807-kube-api-access-kpmsw\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd"
Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015373 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-frr-conf\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015398 4911 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/884e47b6-07a7-4d77-b73a-ffa7a9a59807-metallb-excludel2\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015422 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-frr-sockets\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015441 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-memberlist\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015461 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ad777989-99eb-4ec3-91d2-890190261a26-frr-startup\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015487 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-metrics-certs\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: E0310 14:18:32.015558 4911 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 10 14:18:32 crc kubenswrapper[4911]: 
E0310 14:18:32.015618 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26684eb9-05cb-450a-b9b3-225b34518a92-cert podName:26684eb9-05cb-450a-b9b3-225b34518a92 nodeName:}" failed. No retries permitted until 2026-03-10 14:18:32.515597445 +0000 UTC m=+1017.079117362 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26684eb9-05cb-450a-b9b3-225b34518a92-cert") pod "frr-k8s-webhook-server-7f989f654f-npzpx" (UID: "26684eb9-05cb-450a-b9b3-225b34518a92") : secret "frr-k8s-webhook-server-cert" not found Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015634 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015689 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m2cq\" (UniqueName: \"kubernetes.io/projected/ad777989-99eb-4ec3-91d2-890190261a26-kube-api-access-5m2cq\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015904 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-metrics\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015922 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-reloader\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015942 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ttnpp\" (UniqueName: \"kubernetes.io/projected/26684eb9-05cb-450a-b9b3-225b34518a92-kube-api-access-ttnpp\") pod \"frr-k8s-webhook-server-7f989f654f-npzpx\" (UID: \"26684eb9-05cb-450a-b9b3-225b34518a92\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015979 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad777989-99eb-4ec3-91d2-890190261a26-metrics-certs\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.015983 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-frr-conf\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.016092 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-frr-sockets\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.016305 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ad777989-99eb-4ec3-91d2-890190261a26-metrics\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.016695 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ad777989-99eb-4ec3-91d2-890190261a26-frr-startup\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " 
pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.018872 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wjbx4" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.038589 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad777989-99eb-4ec3-91d2-890190261a26-metrics-certs\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.042571 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m2cq\" (UniqueName: \"kubernetes.io/projected/ad777989-99eb-4ec3-91d2-890190261a26-kube-api-access-5m2cq\") pod \"frr-k8s-6pqxq\" (UID: \"ad777989-99eb-4ec3-91d2-890190261a26\") " pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.053396 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnpp\" (UniqueName: \"kubernetes.io/projected/26684eb9-05cb-450a-b9b3-225b34518a92-kube-api-access-ttnpp\") pod \"frr-k8s-webhook-server-7f989f654f-npzpx\" (UID: \"26684eb9-05cb-450a-b9b3-225b34518a92\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.071779 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-78clj"] Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.072920 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.080056 4911 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.086557 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-78clj"] Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.118961 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpmsw\" (UniqueName: \"kubernetes.io/projected/884e47b6-07a7-4d77-b73a-ffa7a9a59807-kube-api-access-kpmsw\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.119028 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/884e47b6-07a7-4d77-b73a-ffa7a9a59807-metallb-excludel2\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.119078 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-memberlist\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.119109 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-metrics-certs\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: E0310 14:18:32.119300 4911 secret.go:188] Couldn't get secret 
metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 10 14:18:32 crc kubenswrapper[4911]: E0310 14:18:32.119355 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-metrics-certs podName:884e47b6-07a7-4d77-b73a-ffa7a9a59807 nodeName:}" failed. No retries permitted until 2026-03-10 14:18:32.619336454 +0000 UTC m=+1017.182856371 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-metrics-certs") pod "speaker-w9cpd" (UID: "884e47b6-07a7-4d77-b73a-ffa7a9a59807") : secret "speaker-certs-secret" not found Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.120653 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/884e47b6-07a7-4d77-b73a-ffa7a9a59807-metallb-excludel2\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: E0310 14:18:32.120741 4911 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 14:18:32 crc kubenswrapper[4911]: E0310 14:18:32.120796 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-memberlist podName:884e47b6-07a7-4d77-b73a-ffa7a9a59807 nodeName:}" failed. No retries permitted until 2026-03-10 14:18:32.620787083 +0000 UTC m=+1017.184307000 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-memberlist") pod "speaker-w9cpd" (UID: "884e47b6-07a7-4d77-b73a-ffa7a9a59807") : secret "metallb-memberlist" not found Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.161405 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpmsw\" (UniqueName: \"kubernetes.io/projected/884e47b6-07a7-4d77-b73a-ffa7a9a59807-kube-api-access-kpmsw\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.211948 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6pqxq" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.220617 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1220afb2-8f3a-4b0c-8b88-9e690005eaf2-cert\") pod \"controller-86ddb6bd46-78clj\" (UID: \"1220afb2-8f3a-4b0c-8b88-9e690005eaf2\") " pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.220830 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1220afb2-8f3a-4b0c-8b88-9e690005eaf2-metrics-certs\") pod \"controller-86ddb6bd46-78clj\" (UID: \"1220afb2-8f3a-4b0c-8b88-9e690005eaf2\") " pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.220885 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6dlp\" (UniqueName: \"kubernetes.io/projected/1220afb2-8f3a-4b0c-8b88-9e690005eaf2-kube-api-access-q6dlp\") pod \"controller-86ddb6bd46-78clj\" (UID: \"1220afb2-8f3a-4b0c-8b88-9e690005eaf2\") " 
pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.322756 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1220afb2-8f3a-4b0c-8b88-9e690005eaf2-cert\") pod \"controller-86ddb6bd46-78clj\" (UID: \"1220afb2-8f3a-4b0c-8b88-9e690005eaf2\") " pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.322937 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1220afb2-8f3a-4b0c-8b88-9e690005eaf2-metrics-certs\") pod \"controller-86ddb6bd46-78clj\" (UID: \"1220afb2-8f3a-4b0c-8b88-9e690005eaf2\") " pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.322998 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6dlp\" (UniqueName: \"kubernetes.io/projected/1220afb2-8f3a-4b0c-8b88-9e690005eaf2-kube-api-access-q6dlp\") pod \"controller-86ddb6bd46-78clj\" (UID: \"1220afb2-8f3a-4b0c-8b88-9e690005eaf2\") " pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.326781 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1220afb2-8f3a-4b0c-8b88-9e690005eaf2-cert\") pod \"controller-86ddb6bd46-78clj\" (UID: \"1220afb2-8f3a-4b0c-8b88-9e690005eaf2\") " pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.326832 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1220afb2-8f3a-4b0c-8b88-9e690005eaf2-metrics-certs\") pod \"controller-86ddb6bd46-78clj\" (UID: \"1220afb2-8f3a-4b0c-8b88-9e690005eaf2\") " pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 
14:18:32.343084 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6dlp\" (UniqueName: \"kubernetes.io/projected/1220afb2-8f3a-4b0c-8b88-9e690005eaf2-kube-api-access-q6dlp\") pod \"controller-86ddb6bd46-78clj\" (UID: \"1220afb2-8f3a-4b0c-8b88-9e690005eaf2\") " pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.442217 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.526584 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26684eb9-05cb-450a-b9b3-225b34518a92-cert\") pod \"frr-k8s-webhook-server-7f989f654f-npzpx\" (UID: \"26684eb9-05cb-450a-b9b3-225b34518a92\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.536824 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26684eb9-05cb-450a-b9b3-225b34518a92-cert\") pod \"frr-k8s-webhook-server-7f989f654f-npzpx\" (UID: \"26684eb9-05cb-450a-b9b3-225b34518a92\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.629200 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-metrics-certs\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.632093 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-memberlist\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " 
pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: E0310 14:18:32.632285 4911 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 14:18:32 crc kubenswrapper[4911]: E0310 14:18:32.632413 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-memberlist podName:884e47b6-07a7-4d77-b73a-ffa7a9a59807 nodeName:}" failed. No retries permitted until 2026-03-10 14:18:33.632385894 +0000 UTC m=+1018.195905821 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-memberlist") pod "speaker-w9cpd" (UID: "884e47b6-07a7-4d77-b73a-ffa7a9a59807") : secret "metallb-memberlist" not found Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.636449 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-metrics-certs\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.713262 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-78clj"] Mar 10 14:18:32 crc kubenswrapper[4911]: W0310 14:18:32.724273 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1220afb2_8f3a_4b0c_8b88_9e690005eaf2.slice/crio-59e318c3687ea2e849084fef8c52d28e675b77ec684f7795748cd575fe622e5b WatchSource:0}: Error finding container 59e318c3687ea2e849084fef8c52d28e675b77ec684f7795748cd575fe622e5b: Status 404 returned error can't find the container with id 59e318c3687ea2e849084fef8c52d28e675b77ec684f7795748cd575fe622e5b Mar 10 14:18:32 crc kubenswrapper[4911]: I0310 14:18:32.803118 4911 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx" Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.047137 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ttfzk" Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.047521 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ttfzk" Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.067501 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx"] Mar 10 14:18:33 crc kubenswrapper[4911]: W0310 14:18:33.071230 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26684eb9_05cb_450a_b9b3_225b34518a92.slice/crio-52154ac248163975300c2270abef7624effbae13e1b28d7f58e726a0737df339 WatchSource:0}: Error finding container 52154ac248163975300c2270abef7624effbae13e1b28d7f58e726a0737df339: Status 404 returned error can't find the container with id 52154ac248163975300c2270abef7624effbae13e1b28d7f58e726a0737df339 Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.112100 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ttfzk" Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.145202 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6pqxq" event={"ID":"ad777989-99eb-4ec3-91d2-890190261a26","Type":"ContainerStarted","Data":"841f00f597d8f90f43d1e0ab0c39c06cf0a6ee5d5da5daf19747bb522f66e290"} Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.148279 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx" 
event={"ID":"26684eb9-05cb-450a-b9b3-225b34518a92","Type":"ContainerStarted","Data":"52154ac248163975300c2270abef7624effbae13e1b28d7f58e726a0737df339"} Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.151005 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-78clj" event={"ID":"1220afb2-8f3a-4b0c-8b88-9e690005eaf2","Type":"ContainerStarted","Data":"b6eb88ab96ecfd90d5cbc0ae2e59bfcd388dea1ccdba74964502350973391861"} Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.151037 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-78clj" event={"ID":"1220afb2-8f3a-4b0c-8b88-9e690005eaf2","Type":"ContainerStarted","Data":"e12e2f56e5c1acb23ffdaeec45a84ef3d272a726da06c805ec9852e3a96e39ab"} Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.151048 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-78clj" event={"ID":"1220afb2-8f3a-4b0c-8b88-9e690005eaf2","Type":"ContainerStarted","Data":"59e318c3687ea2e849084fef8c52d28e675b77ec684f7795748cd575fe622e5b"} Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.151388 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-78clj" Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.178194 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-78clj" podStartSLOduration=1.178163007 podStartE2EDuration="1.178163007s" podCreationTimestamp="2026-03-10 14:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:18:33.169173314 +0000 UTC m=+1017.732693231" watchObservedRunningTime="2026-03-10 14:18:33.178163007 +0000 UTC m=+1017.741682924" Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.204270 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-ttfzk" Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.351481 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ttfzk"] Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.648115 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-memberlist\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.653125 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/884e47b6-07a7-4d77-b73a-ffa7a9a59807-memberlist\") pod \"speaker-w9cpd\" (UID: \"884e47b6-07a7-4d77-b73a-ffa7a9a59807\") " pod="metallb-system/speaker-w9cpd" Mar 10 14:18:33 crc kubenswrapper[4911]: I0310 14:18:33.919331 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-w9cpd" Mar 10 14:18:34 crc kubenswrapper[4911]: I0310 14:18:34.159447 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w9cpd" event={"ID":"884e47b6-07a7-4d77-b73a-ffa7a9a59807","Type":"ContainerStarted","Data":"b528b5ca6fed6a854bff4ed185269c0877c13d17d3e6f68662d3780830997713"} Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.168922 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w9cpd" event={"ID":"884e47b6-07a7-4d77-b73a-ffa7a9a59807","Type":"ContainerStarted","Data":"c0cae9b17f9e1c7639aa84ba9b5d7a90b30c650ab088e3593b0e8d34cd8f7aba"} Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.169182 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ttfzk" podUID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerName="registry-server" containerID="cri-o://51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1" gracePeriod=2 Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.170749 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w9cpd" event={"ID":"884e47b6-07a7-4d77-b73a-ffa7a9a59807","Type":"ContainerStarted","Data":"953586b0c1c64113fca78cc20d3b7937c24d5b82ffb61ee249a5cb027c792016"} Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.201159 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-w9cpd" podStartSLOduration=4.201136629 podStartE2EDuration="4.201136629s" podCreationTimestamp="2026-03-10 14:18:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:18:35.198010915 +0000 UTC m=+1019.761530832" watchObservedRunningTime="2026-03-10 14:18:35.201136629 +0000 UTC m=+1019.764656546" Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.667946 4911 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.789214 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-catalog-content\") pod \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") "
Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.789431 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-utilities\") pod \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") "
Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.789784 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4brpt\" (UniqueName: \"kubernetes.io/projected/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-kube-api-access-4brpt\") pod \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\" (UID: \"b0a43013-ecd0-4c18-a62b-efeeba73e7ce\") "
Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.794569 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-utilities" (OuterVolumeSpecName: "utilities") pod "b0a43013-ecd0-4c18-a62b-efeeba73e7ce" (UID: "b0a43013-ecd0-4c18-a62b-efeeba73e7ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.808961 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-kube-api-access-4brpt" (OuterVolumeSpecName: "kube-api-access-4brpt") pod "b0a43013-ecd0-4c18-a62b-efeeba73e7ce" (UID: "b0a43013-ecd0-4c18-a62b-efeeba73e7ce"). InnerVolumeSpecName "kube-api-access-4brpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.882079 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0a43013-ecd0-4c18-a62b-efeeba73e7ce" (UID: "b0a43013-ecd0-4c18-a62b-efeeba73e7ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.891268 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.891303 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4brpt\" (UniqueName: \"kubernetes.io/projected/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-kube-api-access-4brpt\") on node \"crc\" DevicePath \"\""
Mar 10 14:18:35 crc kubenswrapper[4911]: I0310 14:18:35.891315 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a43013-ecd0-4c18-a62b-efeeba73e7ce-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.181611 4911 generic.go:334] "Generic (PLEG): container finished" podID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerID="51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1" exitCode=0
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.182695 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttfzk"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.183891 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttfzk" event={"ID":"b0a43013-ecd0-4c18-a62b-efeeba73e7ce","Type":"ContainerDied","Data":"51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1"}
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.183987 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-w9cpd"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.184010 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttfzk" event={"ID":"b0a43013-ecd0-4c18-a62b-efeeba73e7ce","Type":"ContainerDied","Data":"948b5a13f48d9e3cd875e506c9faee52c74680f1dfcf43c3776986d22ee14e5e"}
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.184036 4911 scope.go:117] "RemoveContainer" containerID="51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.231860 4911 scope.go:117] "RemoveContainer" containerID="26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.291031 4911 scope.go:117] "RemoveContainer" containerID="9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.308739 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ttfzk"]
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.320777 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ttfzk"]
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.335603 4911 scope.go:117] "RemoveContainer" containerID="51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1"
Mar 10 14:18:36 crc kubenswrapper[4911]: E0310 14:18:36.336461 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1\": container with ID starting with 51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1 not found: ID does not exist" containerID="51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.336878 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1"} err="failed to get container status \"51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1\": rpc error: code = NotFound desc = could not find container \"51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1\": container with ID starting with 51ee734ee7008057cf98f58a6dd5db3dce3315121456cb763f621cc0e897e9e1 not found: ID does not exist"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.337249 4911 scope.go:117] "RemoveContainer" containerID="26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0"
Mar 10 14:18:36 crc kubenswrapper[4911]: E0310 14:18:36.338344 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0\": container with ID starting with 26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0 not found: ID does not exist" containerID="26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.338395 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0"} err="failed to get container status \"26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0\": rpc error: code = NotFound desc = could not find container \"26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0\": container with ID starting with 26a3bf8b9caa05c6b268f0f7ce90ad4bb559730c6a94db48bf174a10782758b0 not found: ID does not exist"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.338430 4911 scope.go:117] "RemoveContainer" containerID="9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8"
Mar 10 14:18:36 crc kubenswrapper[4911]: E0310 14:18:36.338647 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8\": container with ID starting with 9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8 not found: ID does not exist" containerID="9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.338676 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8"} err="failed to get container status \"9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8\": rpc error: code = NotFound desc = could not find container \"9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8\": container with ID starting with 9a2dd99ffe3379681aa619d6ab41db1fa9fb557c522176929ca7e6162a6981e8 not found: ID does not exist"
Mar 10 14:18:36 crc kubenswrapper[4911]: I0310 14:18:36.954416 4911 scope.go:117] "RemoveContainer" containerID="1612c726eac636d45591ce038742c082ec5a547f4043d8170fb59a22ede3e44c"
Mar 10 14:18:38 crc kubenswrapper[4911]: I0310 14:18:38.210346 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" path="/var/lib/kubelet/pods/b0a43013-ecd0-4c18-a62b-efeeba73e7ce/volumes"
Mar 10 14:18:38 crc kubenswrapper[4911]: E0310 14:18:38.845621 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a43013_ecd0_4c18_a62b_efeeba73e7ce.slice\": RecentStats: unable to find data in memory cache]"
Mar 10 14:18:41 crc kubenswrapper[4911]: I0310 14:18:41.238736 4911 generic.go:334] "Generic (PLEG): container finished" podID="ad777989-99eb-4ec3-91d2-890190261a26" containerID="bc6f5951496f033db909cb8b2e306f86e4c958cfd1a145521d6b849691a1a33c" exitCode=0
Mar 10 14:18:41 crc kubenswrapper[4911]: I0310 14:18:41.238870 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6pqxq" event={"ID":"ad777989-99eb-4ec3-91d2-890190261a26","Type":"ContainerDied","Data":"bc6f5951496f033db909cb8b2e306f86e4c958cfd1a145521d6b849691a1a33c"}
Mar 10 14:18:41 crc kubenswrapper[4911]: I0310 14:18:41.242614 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx" event={"ID":"26684eb9-05cb-450a-b9b3-225b34518a92","Type":"ContainerStarted","Data":"f95ca578a847cb74a8c5b8b4907d2326b7722d6ac632c3a2e5055b4b2a9c0d05"}
Mar 10 14:18:41 crc kubenswrapper[4911]: I0310 14:18:41.242793 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx"
Mar 10 14:18:41 crc kubenswrapper[4911]: I0310 14:18:41.291221 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx" podStartSLOduration=2.634819986 podStartE2EDuration="10.291189194s" podCreationTimestamp="2026-03-10 14:18:31 +0000 UTC" firstStartedPulling="2026-03-10 14:18:33.076037042 +0000 UTC m=+1017.639556949" lastFinishedPulling="2026-03-10 14:18:40.73240624 +0000 UTC m=+1025.295926157" observedRunningTime="2026-03-10 14:18:41.287574877 +0000 UTC m=+1025.851094794" watchObservedRunningTime="2026-03-10 14:18:41.291189194 +0000 UTC m=+1025.854709111"
Mar 10 14:18:42 crc kubenswrapper[4911]: I0310 14:18:42.250304 4911 generic.go:334] "Generic (PLEG): container finished" podID="ad777989-99eb-4ec3-91d2-890190261a26" containerID="29874079afb27c0e7c996c13fee259d583c7e2306cb6e19399a1a5b32905dd62" exitCode=0
Mar 10 14:18:42 crc kubenswrapper[4911]: I0310 14:18:42.250416 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6pqxq" event={"ID":"ad777989-99eb-4ec3-91d2-890190261a26","Type":"ContainerDied","Data":"29874079afb27c0e7c996c13fee259d583c7e2306cb6e19399a1a5b32905dd62"}
Mar 10 14:18:42 crc kubenswrapper[4911]: I0310 14:18:42.463155 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-78clj"
Mar 10 14:18:43 crc kubenswrapper[4911]: I0310 14:18:43.259480 4911 generic.go:334] "Generic (PLEG): container finished" podID="ad777989-99eb-4ec3-91d2-890190261a26" containerID="bd0261761a33c449f100380fd5742da1a0a650473e7af1c8eb42fb014b24e75b" exitCode=0
Mar 10 14:18:43 crc kubenswrapper[4911]: I0310 14:18:43.259550 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6pqxq" event={"ID":"ad777989-99eb-4ec3-91d2-890190261a26","Type":"ContainerDied","Data":"bd0261761a33c449f100380fd5742da1a0a650473e7af1c8eb42fb014b24e75b"}
Mar 10 14:18:44 crc kubenswrapper[4911]: I0310 14:18:44.271646 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6pqxq" event={"ID":"ad777989-99eb-4ec3-91d2-890190261a26","Type":"ContainerStarted","Data":"b5eac821693e3337aa3e2ec65299774284560fafb1d630c5799819bc3eb9e48f"}
Mar 10 14:18:44 crc kubenswrapper[4911]: I0310 14:18:44.272005 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6pqxq" event={"ID":"ad777989-99eb-4ec3-91d2-890190261a26","Type":"ContainerStarted","Data":"1a3867e88314d2b3740f0638c21ade20e167c7dea6e324861158dc299369ace4"}
Mar 10 14:18:44 crc kubenswrapper[4911]: I0310 14:18:44.272023 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6pqxq" event={"ID":"ad777989-99eb-4ec3-91d2-890190261a26","Type":"ContainerStarted","Data":"822192c24ae9444a8e004de86c1abf14652fbd2692714e2dfcefc275890fa333"}
Mar 10 14:18:44 crc kubenswrapper[4911]: I0310 14:18:44.272035 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6pqxq" event={"ID":"ad777989-99eb-4ec3-91d2-890190261a26","Type":"ContainerStarted","Data":"5dfeff9de10351bc5449be71772facef7a1c43b74e2e00c9ac9dfe80899eaf86"}
Mar 10 14:18:44 crc kubenswrapper[4911]: I0310 14:18:44.272048 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6pqxq" event={"ID":"ad777989-99eb-4ec3-91d2-890190261a26","Type":"ContainerStarted","Data":"9d62bd37e993362e5aec04b160688e366dbb96f11edad14560ac3256a86f7784"}
Mar 10 14:18:45 crc kubenswrapper[4911]: I0310 14:18:45.284531 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6pqxq" event={"ID":"ad777989-99eb-4ec3-91d2-890190261a26","Type":"ContainerStarted","Data":"51ffef79c86fa1afb37c827b3c4442dcb91cff6e1f5829e473beccdc59a79bd2"}
Mar 10 14:18:45 crc kubenswrapper[4911]: I0310 14:18:45.318427 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-6pqxq" podStartSLOduration=5.95866015 podStartE2EDuration="14.318389342s" podCreationTimestamp="2026-03-10 14:18:31 +0000 UTC" firstStartedPulling="2026-03-10 14:18:32.360818528 +0000 UTC m=+1016.924338445" lastFinishedPulling="2026-03-10 14:18:40.72054766 +0000 UTC m=+1025.284067637" observedRunningTime="2026-03-10 14:18:45.311608359 +0000 UTC m=+1029.875128316" watchObservedRunningTime="2026-03-10 14:18:45.318389342 +0000 UTC m=+1029.881909299"
Mar 10 14:18:46 crc kubenswrapper[4911]: I0310 14:18:46.291697 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:47 crc kubenswrapper[4911]: I0310 14:18:47.212933 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:47 crc kubenswrapper[4911]: I0310 14:18:47.270575 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:18:48 crc kubenswrapper[4911]: I0310 14:18:48.521150 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:18:48 crc kubenswrapper[4911]: I0310 14:18:48.521484 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:18:48 crc kubenswrapper[4911]: I0310 14:18:48.521544 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx"
Mar 10 14:18:48 crc kubenswrapper[4911]: I0310 14:18:48.522444 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"064f54de59fb1087deb1f06362fea8b7318f6c645504d0d54010b3ae33528b2f"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 14:18:48 crc kubenswrapper[4911]: I0310 14:18:48.522491 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://064f54de59fb1087deb1f06362fea8b7318f6c645504d0d54010b3ae33528b2f" gracePeriod=600
Mar 10 14:18:48 crc kubenswrapper[4911]: E0310 14:18:48.976928 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a43013_ecd0_4c18_a62b_efeeba73e7ce.slice\": RecentStats: unable to find data in memory cache]"
Mar 10 14:18:49 crc kubenswrapper[4911]: I0310 14:18:49.356265 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="064f54de59fb1087deb1f06362fea8b7318f6c645504d0d54010b3ae33528b2f" exitCode=0
Mar 10 14:18:49 crc kubenswrapper[4911]: I0310 14:18:49.356321 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"064f54de59fb1087deb1f06362fea8b7318f6c645504d0d54010b3ae33528b2f"}
Mar 10 14:18:49 crc kubenswrapper[4911]: I0310 14:18:49.356733 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"b451ff8e7fd4c75aa2b34c26affcab379b47b137f2e280e6643a4a5092850d94"}
Mar 10 14:18:49 crc kubenswrapper[4911]: I0310 14:18:49.356814 4911 scope.go:117] "RemoveContainer" containerID="857ad61597498c0292e86491603a433b330dd14022c00daae311c70410368529"
Mar 10 14:18:52 crc kubenswrapper[4911]: I0310 14:18:52.809638 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-npzpx"
Mar 10 14:18:53 crc kubenswrapper[4911]: I0310 14:18:53.923831 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-w9cpd"
Mar 10 14:18:56 crc kubenswrapper[4911]: I0310 14:18:56.990150 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6bwhl"]
Mar 10 14:18:56 crc kubenswrapper[4911]: E0310 14:18:56.990794 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerName="extract-utilities"
Mar 10 14:18:56 crc kubenswrapper[4911]: I0310 14:18:56.990811 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerName="extract-utilities"
Mar 10 14:18:56 crc kubenswrapper[4911]: E0310 14:18:56.990821 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerName="registry-server"
Mar 10 14:18:56 crc kubenswrapper[4911]: I0310 14:18:56.990829 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerName="registry-server"
Mar 10 14:18:56 crc kubenswrapper[4911]: E0310 14:18:56.990844 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerName="extract-content"
Mar 10 14:18:56 crc kubenswrapper[4911]: I0310 14:18:56.990853 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerName="extract-content"
Mar 10 14:18:56 crc kubenswrapper[4911]: I0310 14:18:56.991005 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a43013-ecd0-4c18-a62b-efeeba73e7ce" containerName="registry-server"
Mar 10 14:18:56 crc kubenswrapper[4911]: I0310 14:18:56.991664 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6bwhl"
Mar 10 14:18:56 crc kubenswrapper[4911]: I0310 14:18:56.995301 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 10 14:18:56 crc kubenswrapper[4911]: I0310 14:18:56.996013 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 10 14:18:56 crc kubenswrapper[4911]: I0310 14:18:56.996681 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-xchz4"
Mar 10 14:18:57 crc kubenswrapper[4911]: I0310 14:18:57.014353 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6bwhl"]
Mar 10 14:18:57 crc kubenswrapper[4911]: I0310 14:18:57.189175 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcthg\" (UniqueName: \"kubernetes.io/projected/d835d70f-f1d3-4a4d-ad31-724535b95b0e-kube-api-access-vcthg\") pod \"openstack-operator-index-6bwhl\" (UID: \"d835d70f-f1d3-4a4d-ad31-724535b95b0e\") " pod="openstack-operators/openstack-operator-index-6bwhl"
Mar 10 14:18:57 crc kubenswrapper[4911]: I0310 14:18:57.291157 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcthg\" (UniqueName: \"kubernetes.io/projected/d835d70f-f1d3-4a4d-ad31-724535b95b0e-kube-api-access-vcthg\") pod \"openstack-operator-index-6bwhl\" (UID: \"d835d70f-f1d3-4a4d-ad31-724535b95b0e\") " pod="openstack-operators/openstack-operator-index-6bwhl"
Mar 10 14:18:57 crc kubenswrapper[4911]: I0310 14:18:57.320126 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcthg\" (UniqueName: \"kubernetes.io/projected/d835d70f-f1d3-4a4d-ad31-724535b95b0e-kube-api-access-vcthg\") pod \"openstack-operator-index-6bwhl\" (UID: \"d835d70f-f1d3-4a4d-ad31-724535b95b0e\") " pod="openstack-operators/openstack-operator-index-6bwhl"
Mar 10 14:18:57 crc kubenswrapper[4911]: I0310 14:18:57.612828 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6bwhl"
Mar 10 14:18:58 crc kubenswrapper[4911]: I0310 14:18:58.083876 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6bwhl"]
Mar 10 14:18:58 crc kubenswrapper[4911]: W0310 14:18:58.093491 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd835d70f_f1d3_4a4d_ad31_724535b95b0e.slice/crio-410252260b5e522f2313d0f2d3819222434c1695f36a48256ccac9d837c394db WatchSource:0}: Error finding container 410252260b5e522f2313d0f2d3819222434c1695f36a48256ccac9d837c394db: Status 404 returned error can't find the container with id 410252260b5e522f2313d0f2d3819222434c1695f36a48256ccac9d837c394db
Mar 10 14:18:58 crc kubenswrapper[4911]: I0310 14:18:58.423209 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6bwhl" event={"ID":"d835d70f-f1d3-4a4d-ad31-724535b95b0e","Type":"ContainerStarted","Data":"410252260b5e522f2313d0f2d3819222434c1695f36a48256ccac9d837c394db"}
Mar 10 14:18:59 crc kubenswrapper[4911]: E0310 14:18:59.143358 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a43013_ecd0_4c18_a62b_efeeba73e7ce.slice\": RecentStats: unable to find data in memory cache]"
Mar 10 14:19:00 crc kubenswrapper[4911]: I0310 14:19:00.165762 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6bwhl"]
Mar 10 14:19:00 crc kubenswrapper[4911]: I0310 14:19:00.771040 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5f7w8"]
Mar 10 14:19:00 crc kubenswrapper[4911]: I0310 14:19:00.772388 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5f7w8"
Mar 10 14:19:00 crc kubenswrapper[4911]: I0310 14:19:00.778767 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5f7w8"]
Mar 10 14:19:00 crc kubenswrapper[4911]: I0310 14:19:00.947040 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-825dn\" (UniqueName: \"kubernetes.io/projected/2ccee19d-0e75-4358-87aa-16359f6bd2ee-kube-api-access-825dn\") pod \"openstack-operator-index-5f7w8\" (UID: \"2ccee19d-0e75-4358-87aa-16359f6bd2ee\") " pod="openstack-operators/openstack-operator-index-5f7w8"
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.049008 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-825dn\" (UniqueName: \"kubernetes.io/projected/2ccee19d-0e75-4358-87aa-16359f6bd2ee-kube-api-access-825dn\") pod \"openstack-operator-index-5f7w8\" (UID: \"2ccee19d-0e75-4358-87aa-16359f6bd2ee\") " pod="openstack-operators/openstack-operator-index-5f7w8"
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.076931 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-825dn\" (UniqueName: \"kubernetes.io/projected/2ccee19d-0e75-4358-87aa-16359f6bd2ee-kube-api-access-825dn\") pod \"openstack-operator-index-5f7w8\" (UID: \"2ccee19d-0e75-4358-87aa-16359f6bd2ee\") " pod="openstack-operators/openstack-operator-index-5f7w8"
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.099146 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5f7w8"
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.447864 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6bwhl" event={"ID":"d835d70f-f1d3-4a4d-ad31-724535b95b0e","Type":"ContainerStarted","Data":"8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74"}
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.448925 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-6bwhl" podUID="d835d70f-f1d3-4a4d-ad31-724535b95b0e" containerName="registry-server" containerID="cri-o://8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74" gracePeriod=2
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.469602 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6bwhl" podStartSLOduration=2.324088295 podStartE2EDuration="5.469575446s" podCreationTimestamp="2026-03-10 14:18:56 +0000 UTC" firstStartedPulling="2026-03-10 14:18:58.095843668 +0000 UTC m=+1042.659363585" lastFinishedPulling="2026-03-10 14:19:01.241330819 +0000 UTC m=+1045.804850736" observedRunningTime="2026-03-10 14:19:01.463040743 +0000 UTC m=+1046.026560660" watchObservedRunningTime="2026-03-10 14:19:01.469575446 +0000 UTC m=+1046.033095363"
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.640095 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5f7w8"]
Mar 10 14:19:01 crc kubenswrapper[4911]: W0310 14:19:01.648085 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ccee19d_0e75_4358_87aa_16359f6bd2ee.slice/crio-3f42be6e1e25a3602c0cf632e0026f86f75a68fa6d7ea019154b59e34e476593 WatchSource:0}: Error finding container 3f42be6e1e25a3602c0cf632e0026f86f75a68fa6d7ea019154b59e34e476593: Status 404 returned error can't find the container with id 3f42be6e1e25a3602c0cf632e0026f86f75a68fa6d7ea019154b59e34e476593
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.807752 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6bwhl_d835d70f-f1d3-4a4d-ad31-724535b95b0e/registry-server/0.log"
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.807859 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6bwhl"
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.964426 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcthg\" (UniqueName: \"kubernetes.io/projected/d835d70f-f1d3-4a4d-ad31-724535b95b0e-kube-api-access-vcthg\") pod \"d835d70f-f1d3-4a4d-ad31-724535b95b0e\" (UID: \"d835d70f-f1d3-4a4d-ad31-724535b95b0e\") "
Mar 10 14:19:01 crc kubenswrapper[4911]: I0310 14:19:01.972904 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d835d70f-f1d3-4a4d-ad31-724535b95b0e-kube-api-access-vcthg" (OuterVolumeSpecName: "kube-api-access-vcthg") pod "d835d70f-f1d3-4a4d-ad31-724535b95b0e" (UID: "d835d70f-f1d3-4a4d-ad31-724535b95b0e"). InnerVolumeSpecName "kube-api-access-vcthg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.066410 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcthg\" (UniqueName: \"kubernetes.io/projected/d835d70f-f1d3-4a4d-ad31-724535b95b0e-kube-api-access-vcthg\") on node \"crc\" DevicePath \"\""
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.216015 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6pqxq"
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.463413 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5f7w8" event={"ID":"2ccee19d-0e75-4358-87aa-16359f6bd2ee","Type":"ContainerStarted","Data":"61d3c87e5060b8c1a6b8992da92b81197d5a527ca0e698ee7d809adc33debeff"}
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.465021 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5f7w8" event={"ID":"2ccee19d-0e75-4358-87aa-16359f6bd2ee","Type":"ContainerStarted","Data":"3f42be6e1e25a3602c0cf632e0026f86f75a68fa6d7ea019154b59e34e476593"}
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.473457 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6bwhl_d835d70f-f1d3-4a4d-ad31-724535b95b0e/registry-server/0.log"
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.473563 4911 generic.go:334] "Generic (PLEG): container finished" podID="d835d70f-f1d3-4a4d-ad31-724535b95b0e" containerID="8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74" exitCode=2
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.473624 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6bwhl" event={"ID":"d835d70f-f1d3-4a4d-ad31-724535b95b0e","Type":"ContainerDied","Data":"8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74"}
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.473676 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6bwhl" event={"ID":"d835d70f-f1d3-4a4d-ad31-724535b95b0e","Type":"ContainerDied","Data":"410252260b5e522f2313d0f2d3819222434c1695f36a48256ccac9d837c394db"}
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.473715 4911 scope.go:117] "RemoveContainer" containerID="8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74"
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.474037 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6bwhl"
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.509695 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5f7w8" podStartSLOduration=2.457979971 podStartE2EDuration="2.509659438s" podCreationTimestamp="2026-03-10 14:19:00 +0000 UTC" firstStartedPulling="2026-03-10 14:19:01.653425539 +0000 UTC m=+1046.216945446" lastFinishedPulling="2026-03-10 14:19:01.705104996 +0000 UTC m=+1046.268624913" observedRunningTime="2026-03-10 14:19:02.490502761 +0000 UTC m=+1047.054022678" watchObservedRunningTime="2026-03-10 14:19:02.509659438 +0000 UTC m=+1047.073179385"
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.515102 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6bwhl"]
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.521379 4911 scope.go:117] "RemoveContainer" containerID="8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74"
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.521996 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-6bwhl"]
Mar 10 14:19:02 crc kubenswrapper[4911]: E0310 14:19:02.522364 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74\": container with ID starting with 8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74 not found: ID does not exist" containerID="8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74"
Mar 10 14:19:02 crc kubenswrapper[4911]: I0310 14:19:02.522554 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74"} err="failed to get container status \"8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74\": rpc error: code = NotFound desc = could not find container \"8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74\": container with ID starting with 8a80c6712e59c3c38bd56eb4221a9e2d14807e309bc1e34089051f71b2f08c74 not found: ID does not exist"
Mar 10 14:19:04 crc kubenswrapper[4911]: I0310 14:19:04.209285 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d835d70f-f1d3-4a4d-ad31-724535b95b0e" path="/var/lib/kubelet/pods/d835d70f-f1d3-4a4d-ad31-724535b95b0e/volumes"
Mar 10 14:19:09 crc kubenswrapper[4911]: E0310 14:19:09.347500 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a43013_ecd0_4c18_a62b_efeeba73e7ce.slice\": RecentStats: unable to find data in memory cache]"
Mar 10 14:19:11 crc kubenswrapper[4911]: I0310 14:19:11.100380 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-5f7w8"
Mar 10 14:19:11 crc kubenswrapper[4911]: I0310 14:19:11.100994 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-5f7w8"
Mar 10 14:19:11 crc kubenswrapper[4911]: I0310 14:19:11.138257 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-5f7w8"
Mar 10 14:19:11 crc kubenswrapper[4911]: I0310 14:19:11.587273 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-5f7w8"
Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.230767 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv"]
Mar 10 14:19:12 crc kubenswrapper[4911]: E0310 14:19:12.231664 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d835d70f-f1d3-4a4d-ad31-724535b95b0e" containerName="registry-server"
Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.231695 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d835d70f-f1d3-4a4d-ad31-724535b95b0e" containerName="registry-server"
Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.232067 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d835d70f-f1d3-4a4d-ad31-724535b95b0e" containerName="registry-server"
Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.234571 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv"
Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.248137 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9k5xm"
Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.257695 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv"]
Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.356178 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjqh7\" (UniqueName: \"kubernetes.io/projected/491a73fc-185e-46a0-815e-b1ec70061fc5-kube-api-access-xjqh7\") pod \"93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv"
Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.356269 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-bundle\") pod \"93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv"
Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.356345 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-util\") pod \"93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv"
Mar 10 14:19:12 crc kubenswrapper[4911]: I0310
14:19:12.458679 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-bundle\") pod \"93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.458800 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjqh7\" (UniqueName: \"kubernetes.io/projected/491a73fc-185e-46a0-815e-b1ec70061fc5-kube-api-access-xjqh7\") pod \"93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.458914 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-util\") pod \"93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.459498 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-bundle\") pod \"93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.459708 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-util\") pod \"93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.479429 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjqh7\" (UniqueName: \"kubernetes.io/projected/491a73fc-185e-46a0-815e-b1ec70061fc5-kube-api-access-xjqh7\") pod \"93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" Mar 10 14:19:12 crc kubenswrapper[4911]: I0310 14:19:12.622318 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" Mar 10 14:19:13 crc kubenswrapper[4911]: I0310 14:19:13.037509 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv"] Mar 10 14:19:13 crc kubenswrapper[4911]: I0310 14:19:13.576980 4911 generic.go:334] "Generic (PLEG): container finished" podID="491a73fc-185e-46a0-815e-b1ec70061fc5" containerID="294aeba69002d198c0996897647f609769305da0fb1162e0265cff4a8f4ccfbc" exitCode=0 Mar 10 14:19:13 crc kubenswrapper[4911]: I0310 14:19:13.577088 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" event={"ID":"491a73fc-185e-46a0-815e-b1ec70061fc5","Type":"ContainerDied","Data":"294aeba69002d198c0996897647f609769305da0fb1162e0265cff4a8f4ccfbc"} Mar 10 14:19:13 crc kubenswrapper[4911]: I0310 14:19:13.577296 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" event={"ID":"491a73fc-185e-46a0-815e-b1ec70061fc5","Type":"ContainerStarted","Data":"a04e067a0787bec8cfef19f4d8701612e61bab4c815af25b3d5ab677e760f823"} Mar 10 14:19:14 crc kubenswrapper[4911]: I0310 14:19:14.589664 4911 generic.go:334] "Generic (PLEG): container finished" podID="491a73fc-185e-46a0-815e-b1ec70061fc5" containerID="d38edfb4007a176989b49853b876ce607eefe088bec8dc9bf2a6ecf230a98c29" exitCode=0 Mar 10 14:19:14 crc kubenswrapper[4911]: I0310 14:19:14.589736 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" event={"ID":"491a73fc-185e-46a0-815e-b1ec70061fc5","Type":"ContainerDied","Data":"d38edfb4007a176989b49853b876ce607eefe088bec8dc9bf2a6ecf230a98c29"} Mar 10 14:19:15 crc kubenswrapper[4911]: I0310 14:19:15.600832 4911 generic.go:334] "Generic (PLEG): container finished" podID="491a73fc-185e-46a0-815e-b1ec70061fc5" containerID="a5caea5f895d6e5eee4cb7a2f3eb517f5c76dc87784af29985a12bfae1c92783" exitCode=0 Mar 10 14:19:15 crc kubenswrapper[4911]: I0310 14:19:15.600905 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" event={"ID":"491a73fc-185e-46a0-815e-b1ec70061fc5","Type":"ContainerDied","Data":"a5caea5f895d6e5eee4cb7a2f3eb517f5c76dc87784af29985a12bfae1c92783"} Mar 10 14:19:16 crc kubenswrapper[4911]: I0310 14:19:16.795817 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" Mar 10 14:19:16 crc kubenswrapper[4911]: I0310 14:19:16.949027 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-util\") pod \"491a73fc-185e-46a0-815e-b1ec70061fc5\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " Mar 10 14:19:16 crc kubenswrapper[4911]: I0310 14:19:16.949092 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjqh7\" (UniqueName: \"kubernetes.io/projected/491a73fc-185e-46a0-815e-b1ec70061fc5-kube-api-access-xjqh7\") pod \"491a73fc-185e-46a0-815e-b1ec70061fc5\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " Mar 10 14:19:16 crc kubenswrapper[4911]: I0310 14:19:16.949158 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-bundle\") pod \"491a73fc-185e-46a0-815e-b1ec70061fc5\" (UID: \"491a73fc-185e-46a0-815e-b1ec70061fc5\") " Mar 10 14:19:16 crc kubenswrapper[4911]: I0310 14:19:16.949973 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-bundle" (OuterVolumeSpecName: "bundle") pod "491a73fc-185e-46a0-815e-b1ec70061fc5" (UID: "491a73fc-185e-46a0-815e-b1ec70061fc5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:19:16 crc kubenswrapper[4911]: I0310 14:19:16.954368 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491a73fc-185e-46a0-815e-b1ec70061fc5-kube-api-access-xjqh7" (OuterVolumeSpecName: "kube-api-access-xjqh7") pod "491a73fc-185e-46a0-815e-b1ec70061fc5" (UID: "491a73fc-185e-46a0-815e-b1ec70061fc5"). InnerVolumeSpecName "kube-api-access-xjqh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:19:16 crc kubenswrapper[4911]: I0310 14:19:16.961902 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-util" (OuterVolumeSpecName: "util") pod "491a73fc-185e-46a0-815e-b1ec70061fc5" (UID: "491a73fc-185e-46a0-815e-b1ec70061fc5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:19:17 crc kubenswrapper[4911]: I0310 14:19:17.050476 4911 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-util\") on node \"crc\" DevicePath \"\"" Mar 10 14:19:17 crc kubenswrapper[4911]: I0310 14:19:17.050509 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjqh7\" (UniqueName: \"kubernetes.io/projected/491a73fc-185e-46a0-815e-b1ec70061fc5-kube-api-access-xjqh7\") on node \"crc\" DevicePath \"\"" Mar 10 14:19:17 crc kubenswrapper[4911]: I0310 14:19:17.050548 4911 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/491a73fc-185e-46a0-815e-b1ec70061fc5-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:19:17 crc kubenswrapper[4911]: I0310 14:19:17.616383 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" event={"ID":"491a73fc-185e-46a0-815e-b1ec70061fc5","Type":"ContainerDied","Data":"a04e067a0787bec8cfef19f4d8701612e61bab4c815af25b3d5ab677e760f823"} Mar 10 14:19:17 crc kubenswrapper[4911]: I0310 14:19:17.616451 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a04e067a0787bec8cfef19f4d8701612e61bab4c815af25b3d5ab677e760f823" Mar 10 14:19:17 crc kubenswrapper[4911]: I0310 14:19:17.616557 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv" Mar 10 14:19:19 crc kubenswrapper[4911]: E0310 14:19:19.503421 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a43013_ecd0_4c18_a62b_efeeba73e7ce.slice\": RecentStats: unable to find data in memory cache]" Mar 10 14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.594451 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4"] Mar 10 14:19:24 crc kubenswrapper[4911]: E0310 14:19:24.595376 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491a73fc-185e-46a0-815e-b1ec70061fc5" containerName="pull" Mar 10 14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.595393 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="491a73fc-185e-46a0-815e-b1ec70061fc5" containerName="pull" Mar 10 14:19:24 crc kubenswrapper[4911]: E0310 14:19:24.595411 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491a73fc-185e-46a0-815e-b1ec70061fc5" containerName="extract" Mar 10 14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.595420 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="491a73fc-185e-46a0-815e-b1ec70061fc5" containerName="extract" Mar 10 14:19:24 crc kubenswrapper[4911]: E0310 14:19:24.595436 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491a73fc-185e-46a0-815e-b1ec70061fc5" containerName="util" Mar 10 14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.595446 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="491a73fc-185e-46a0-815e-b1ec70061fc5" containerName="util" Mar 10 14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.595617 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="491a73fc-185e-46a0-815e-b1ec70061fc5" containerName="extract" Mar 10 14:19:24 crc 
kubenswrapper[4911]: I0310 14:19:24.596282 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4" Mar 10 14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.598541 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-s4ck6" Mar 10 14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.617602 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4"] Mar 10 14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.764329 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbwb4\" (UniqueName: \"kubernetes.io/projected/f34f11f5-13c7-426d-b30b-127ddd115a17-kube-api-access-rbwb4\") pod \"openstack-operator-controller-init-554774d6c8-bpzx4\" (UID: \"f34f11f5-13c7-426d-b30b-127ddd115a17\") " pod="openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4" Mar 10 14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.866113 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwb4\" (UniqueName: \"kubernetes.io/projected/f34f11f5-13c7-426d-b30b-127ddd115a17-kube-api-access-rbwb4\") pod \"openstack-operator-controller-init-554774d6c8-bpzx4\" (UID: \"f34f11f5-13c7-426d-b30b-127ddd115a17\") " pod="openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4" Mar 10 14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.895841 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwb4\" (UniqueName: \"kubernetes.io/projected/f34f11f5-13c7-426d-b30b-127ddd115a17-kube-api-access-rbwb4\") pod \"openstack-operator-controller-init-554774d6c8-bpzx4\" (UID: \"f34f11f5-13c7-426d-b30b-127ddd115a17\") " pod="openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4" Mar 10 
14:19:24 crc kubenswrapper[4911]: I0310 14:19:24.917041 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4" Mar 10 14:19:25 crc kubenswrapper[4911]: I0310 14:19:25.128307 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4"] Mar 10 14:19:25 crc kubenswrapper[4911]: I0310 14:19:25.675491 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4" event={"ID":"f34f11f5-13c7-426d-b30b-127ddd115a17","Type":"ContainerStarted","Data":"fced4aae0f98a0d99d477b84a2efee8eaa611d28f94b97d43665f086ef1f56b8"} Mar 10 14:19:29 crc kubenswrapper[4911]: E0310 14:19:29.691766 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a43013_ecd0_4c18_a62b_efeeba73e7ce.slice\": RecentStats: unable to find data in memory cache]" Mar 10 14:19:30 crc kubenswrapper[4911]: I0310 14:19:30.738345 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4" event={"ID":"f34f11f5-13c7-426d-b30b-127ddd115a17","Type":"ContainerStarted","Data":"fa9d2817831d6a90cece54a3f6550ca35902476d27772ca1d784cd7eae6236b9"} Mar 10 14:19:30 crc kubenswrapper[4911]: I0310 14:19:30.738761 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4" Mar 10 14:19:30 crc kubenswrapper[4911]: I0310 14:19:30.791322 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4" podStartSLOduration=1.611812573 podStartE2EDuration="6.791292255s" podCreationTimestamp="2026-03-10 14:19:24 +0000 UTC" 
firstStartedPulling="2026-03-10 14:19:25.135828482 +0000 UTC m=+1069.699348399" lastFinishedPulling="2026-03-10 14:19:30.315308164 +0000 UTC m=+1074.878828081" observedRunningTime="2026-03-10 14:19:30.78809101 +0000 UTC m=+1075.351610937" watchObservedRunningTime="2026-03-10 14:19:30.791292255 +0000 UTC m=+1075.354812172" Mar 10 14:19:44 crc kubenswrapper[4911]: I0310 14:19:44.921211 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-554774d6c8-bpzx4" Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.161605 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552540-bbrwg"] Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.172849 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552540-bbrwg" Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.178935 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.179051 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.179325 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.182692 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552540-bbrwg"] Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.339248 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcsws\" (UniqueName: \"kubernetes.io/projected/708857de-1db1-4764-8041-2bd173460cea-kube-api-access-hcsws\") pod \"auto-csr-approver-29552540-bbrwg\" (UID: \"708857de-1db1-4764-8041-2bd173460cea\") " 
pod="openshift-infra/auto-csr-approver-29552540-bbrwg" Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.441094 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcsws\" (UniqueName: \"kubernetes.io/projected/708857de-1db1-4764-8041-2bd173460cea-kube-api-access-hcsws\") pod \"auto-csr-approver-29552540-bbrwg\" (UID: \"708857de-1db1-4764-8041-2bd173460cea\") " pod="openshift-infra/auto-csr-approver-29552540-bbrwg" Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.467895 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcsws\" (UniqueName: \"kubernetes.io/projected/708857de-1db1-4764-8041-2bd173460cea-kube-api-access-hcsws\") pod \"auto-csr-approver-29552540-bbrwg\" (UID: \"708857de-1db1-4764-8041-2bd173460cea\") " pod="openshift-infra/auto-csr-approver-29552540-bbrwg" Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.523794 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552540-bbrwg" Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.912625 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552540-bbrwg"] Mar 10 14:20:00 crc kubenswrapper[4911]: I0310 14:20:00.935700 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552540-bbrwg" event={"ID":"708857de-1db1-4764-8041-2bd173460cea","Type":"ContainerStarted","Data":"f2dd8d74f21312ca7582ff609384d50b649edc5710d606c6ea9cefd82e70cfc2"} Mar 10 14:20:03 crc kubenswrapper[4911]: I0310 14:20:03.974676 4911 generic.go:334] "Generic (PLEG): container finished" podID="708857de-1db1-4764-8041-2bd173460cea" containerID="79abc93edbe6f6458ffb43229595a67741637c5a8b2ad8aeeaf74ee842553c06" exitCode=0 Mar 10 14:20:03 crc kubenswrapper[4911]: I0310 14:20:03.974865 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552540-bbrwg" event={"ID":"708857de-1db1-4764-8041-2bd173460cea","Type":"ContainerDied","Data":"79abc93edbe6f6458ffb43229595a67741637c5a8b2ad8aeeaf74ee842553c06"} Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.291557 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552540-bbrwg" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.452078 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcsws\" (UniqueName: \"kubernetes.io/projected/708857de-1db1-4764-8041-2bd173460cea-kube-api-access-hcsws\") pod \"708857de-1db1-4764-8041-2bd173460cea\" (UID: \"708857de-1db1-4764-8041-2bd173460cea\") " Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.458905 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708857de-1db1-4764-8041-2bd173460cea-kube-api-access-hcsws" (OuterVolumeSpecName: "kube-api-access-hcsws") pod "708857de-1db1-4764-8041-2bd173460cea" (UID: "708857de-1db1-4764-8041-2bd173460cea"). InnerVolumeSpecName "kube-api-access-hcsws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.515061 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7"] Mar 10 14:20:05 crc kubenswrapper[4911]: E0310 14:20:05.515400 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708857de-1db1-4764-8041-2bd173460cea" containerName="oc" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.515413 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="708857de-1db1-4764-8041-2bd173460cea" containerName="oc" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.515577 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="708857de-1db1-4764-8041-2bd173460cea" containerName="oc" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.516128 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.518654 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5db5n" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.525091 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.526212 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.531321 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-slngz" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.539514 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.540771 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.553638 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-rngdx" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.554410 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcsws\" (UniqueName: \"kubernetes.io/projected/708857de-1db1-4764-8041-2bd173460cea-kube-api-access-hcsws\") on node \"crc\" DevicePath \"\"" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.563631 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.569852 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.571064 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.575304 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rklk6" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.584045 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.585023 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.589233 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6b5xj" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.589489 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.605222 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.615970 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.616884 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.629964 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.635211 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-5xpgj" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.651020 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.655424 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7vvc\" (UniqueName: \"kubernetes.io/projected/c8490ff5-eaf3-4d9e-b9f9-7ad3ae159298-kube-api-access-x7vvc\") pod \"designate-operator-controller-manager-66d56f6ff4-t2lgw\" (UID: \"c8490ff5-eaf3-4d9e-b9f9-7ad3ae159298\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.655981 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs99z\" (UniqueName: \"kubernetes.io/projected/a937a94a-14cb-4319-9147-d0ac60c5cc6a-kube-api-access-hs99z\") pod \"cinder-operator-controller-manager-984cd4dcf-hngxq\" (UID: \"a937a94a-14cb-4319-9147-d0ac60c5cc6a\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.656073 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4mkl\" (UniqueName: \"kubernetes.io/projected/0dabe548-2d6c-4bbb-8199-6403e57d2ac9-kube-api-access-t4mkl\") pod 
\"barbican-operator-controller-manager-677bd678f7-wv8p7\" (UID: \"0dabe548-2d6c-4bbb-8199-6403e57d2ac9\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.658963 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.659889 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.667642 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.667844 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-w7xjt" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.669833 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.670941 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.677131 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vhs9q" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.681562 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.682687 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.688250 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2lcxb" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.696495 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.711539 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.721034 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.730938 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.749791 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.751565 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.756462 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lnjmc" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.758700 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcdnt\" (UniqueName: \"kubernetes.io/projected/e2b89cc3-8229-4401-b8e9-9a32bffb0f57-kube-api-access-zcdnt\") pod \"horizon-operator-controller-manager-6d9d6b584d-72kfl\" (UID: \"e2b89cc3-8229-4401-b8e9-9a32bffb0f57\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.758761 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs99z\" (UniqueName: \"kubernetes.io/projected/a937a94a-14cb-4319-9147-d0ac60c5cc6a-kube-api-access-hs99z\") pod \"cinder-operator-controller-manager-984cd4dcf-hngxq\" (UID: \"a937a94a-14cb-4319-9147-d0ac60c5cc6a\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.758792 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.758811 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4mkl\" (UniqueName: \"kubernetes.io/projected/0dabe548-2d6c-4bbb-8199-6403e57d2ac9-kube-api-access-t4mkl\") pod 
\"barbican-operator-controller-manager-677bd678f7-wv8p7\" (UID: \"0dabe548-2d6c-4bbb-8199-6403e57d2ac9\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.758848 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7vvc\" (UniqueName: \"kubernetes.io/projected/c8490ff5-eaf3-4d9e-b9f9-7ad3ae159298-kube-api-access-x7vvc\") pod \"designate-operator-controller-manager-66d56f6ff4-t2lgw\" (UID: \"c8490ff5-eaf3-4d9e-b9f9-7ad3ae159298\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.758866 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8prw8\" (UniqueName: \"kubernetes.io/projected/c5336054-5038-40f7-8512-9fe34269f6cd-kube-api-access-8prw8\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.758893 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlmzp\" (UniqueName: \"kubernetes.io/projected/6a0bd4c9-4420-48be-9637-67ea2b5c89d1-kube-api-access-rlmzp\") pod \"glance-operator-controller-manager-5964f64c48-z5pz7\" (UID: \"6a0bd4c9-4420-48be-9637-67ea2b5c89d1\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.758912 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgbvd\" (UniqueName: \"kubernetes.io/projected/14dd9547-ff92-4cb4-a055-e41fd390e90e-kube-api-access-vgbvd\") pod \"heat-operator-controller-manager-77b6666d85-8mqrm\" (UID: 
\"14dd9547-ff92-4cb4-a055-e41fd390e90e\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.788168 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.790151 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.801718 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-l8dlv" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.832036 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs99z\" (UniqueName: \"kubernetes.io/projected/a937a94a-14cb-4319-9147-d0ac60c5cc6a-kube-api-access-hs99z\") pod \"cinder-operator-controller-manager-984cd4dcf-hngxq\" (UID: \"a937a94a-14cb-4319-9147-d0ac60c5cc6a\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.832046 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7vvc\" (UniqueName: \"kubernetes.io/projected/c8490ff5-eaf3-4d9e-b9f9-7ad3ae159298-kube-api-access-x7vvc\") pod \"designate-operator-controller-manager-66d56f6ff4-t2lgw\" (UID: \"c8490ff5-eaf3-4d9e-b9f9-7ad3ae159298\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.836448 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.836823 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t4mkl\" (UniqueName: \"kubernetes.io/projected/0dabe548-2d6c-4bbb-8199-6403e57d2ac9-kube-api-access-t4mkl\") pod \"barbican-operator-controller-manager-677bd678f7-wv8p7\" (UID: \"0dabe548-2d6c-4bbb-8199-6403e57d2ac9\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.840570 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.855547 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.856830 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.860649 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8prw8\" (UniqueName: \"kubernetes.io/projected/c5336054-5038-40f7-8512-9fe34269f6cd-kube-api-access-8prw8\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.860707 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlmzp\" (UniqueName: \"kubernetes.io/projected/6a0bd4c9-4420-48be-9637-67ea2b5c89d1-kube-api-access-rlmzp\") pod \"glance-operator-controller-manager-5964f64c48-z5pz7\" (UID: \"6a0bd4c9-4420-48be-9637-67ea2b5c89d1\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.860757 4911 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-vgbvd\" (UniqueName: \"kubernetes.io/projected/14dd9547-ff92-4cb4-a055-e41fd390e90e-kube-api-access-vgbvd\") pod \"heat-operator-controller-manager-77b6666d85-8mqrm\" (UID: \"14dd9547-ff92-4cb4-a055-e41fd390e90e\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.860805 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbfjz\" (UniqueName: \"kubernetes.io/projected/06a5238b-e7e1-49a5-9bb8-5f6162183a13-kube-api-access-zbfjz\") pod \"ironic-operator-controller-manager-6bbb499bbc-t2qf9\" (UID: \"06a5238b-e7e1-49a5-9bb8-5f6162183a13\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.860843 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cf65\" (UniqueName: \"kubernetes.io/projected/998d9bc8-11c1-4967-b3d9-1c823d6c41d6-kube-api-access-6cf65\") pod \"keystone-operator-controller-manager-684f77d66d-xw7bn\" (UID: \"998d9bc8-11c1-4967-b3d9-1c823d6c41d6\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.860868 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcdnt\" (UniqueName: \"kubernetes.io/projected/e2b89cc3-8229-4401-b8e9-9a32bffb0f57-kube-api-access-zcdnt\") pod \"horizon-operator-controller-manager-6d9d6b584d-72kfl\" (UID: \"e2b89cc3-8229-4401-b8e9-9a32bffb0f57\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.860899 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psnbg\" (UniqueName: 
\"kubernetes.io/projected/90f49412-d2c3-46ba-9591-5adee9624834-kube-api-access-psnbg\") pod \"manila-operator-controller-manager-68f45f9d9f-8mq9k\" (UID: \"90f49412-d2c3-46ba-9591-5adee9624834\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.860928 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:05 crc kubenswrapper[4911]: E0310 14:20:05.861066 4911 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 14:20:05 crc kubenswrapper[4911]: E0310 14:20:05.861130 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert podName:c5336054-5038-40f7-8512-9fe34269f6cd nodeName:}" failed. No retries permitted until 2026-03-10 14:20:06.361107826 +0000 UTC m=+1110.924627743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert") pod "infra-operator-controller-manager-5995f4446f-jjsfs" (UID: "c5336054-5038-40f7-8512-9fe34269f6cd") : secret "infra-operator-webhook-server-cert" not found Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.867101 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.884843 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.886696 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.896782 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.896891 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mhh6j" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.902105 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.903889 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.910858 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8prw8\" (UniqueName: \"kubernetes.io/projected/c5336054-5038-40f7-8512-9fe34269f6cd-kube-api-access-8prw8\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.910950 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-57mls"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.911508 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-27c2t" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.911971 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.914922 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-t2f8s" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.932471 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlmzp\" (UniqueName: \"kubernetes.io/projected/6a0bd4c9-4420-48be-9637-67ea2b5c89d1-kube-api-access-rlmzp\") pod \"glance-operator-controller-manager-5964f64c48-z5pz7\" (UID: \"6a0bd4c9-4420-48be-9637-67ea2b5c89d1\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.933201 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgbvd\" (UniqueName: \"kubernetes.io/projected/14dd9547-ff92-4cb4-a055-e41fd390e90e-kube-api-access-vgbvd\") pod \"heat-operator-controller-manager-77b6666d85-8mqrm\" (UID: \"14dd9547-ff92-4cb4-a055-e41fd390e90e\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.939169 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.943792 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcdnt\" (UniqueName: \"kubernetes.io/projected/e2b89cc3-8229-4401-b8e9-9a32bffb0f57-kube-api-access-zcdnt\") pod \"horizon-operator-controller-manager-6d9d6b584d-72kfl\" (UID: \"e2b89cc3-8229-4401-b8e9-9a32bffb0f57\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.953251 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.958596 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-57mls"] Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.966204 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psnbg\" (UniqueName: \"kubernetes.io/projected/90f49412-d2c3-46ba-9591-5adee9624834-kube-api-access-psnbg\") pod \"manila-operator-controller-manager-68f45f9d9f-8mq9k\" (UID: \"90f49412-d2c3-46ba-9591-5adee9624834\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.966267 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc6vg\" (UniqueName: \"kubernetes.io/projected/c5080d31-7711-4e4a-9902-4843929a16e9-kube-api-access-tc6vg\") pod \"nova-operator-controller-manager-569cc54c5-57mls\" (UID: \"c5080d31-7711-4e4a-9902-4843929a16e9\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.966292 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l545\" (UniqueName: \"kubernetes.io/projected/7c70b1a5-051f-43b5-80a3-1b462b9a50f8-kube-api-access-4l545\") pod \"neutron-operator-controller-manager-776c5696bf-5v9fq\" (UID: \"7c70b1a5-051f-43b5-80a3-1b462b9a50f8\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.966319 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zbj4\" (UniqueName: \"kubernetes.io/projected/5cf94e3e-325d-4364-bf70-c479683b2be6-kube-api-access-2zbj4\") 
pod \"octavia-operator-controller-manager-5f4f55cb5c-cqdch\" (UID: \"5cf94e3e-325d-4364-bf70-c479683b2be6\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.966364 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbfjz\" (UniqueName: \"kubernetes.io/projected/06a5238b-e7e1-49a5-9bb8-5f6162183a13-kube-api-access-zbfjz\") pod \"ironic-operator-controller-manager-6bbb499bbc-t2qf9\" (UID: \"06a5238b-e7e1-49a5-9bb8-5f6162183a13\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.966386 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4z9n\" (UniqueName: \"kubernetes.io/projected/53d2a376-b957-4875-8bfe-42d5dbc0a634-kube-api-access-m4z9n\") pod \"mariadb-operator-controller-manager-658d4cdd5-7pcfv\" (UID: \"53d2a376-b957-4875-8bfe-42d5dbc0a634\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.966412 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cf65\" (UniqueName: \"kubernetes.io/projected/998d9bc8-11c1-4967-b3d9-1c823d6c41d6-kube-api-access-6cf65\") pod \"keystone-operator-controller-manager-684f77d66d-xw7bn\" (UID: \"998d9bc8-11c1-4967-b3d9-1c823d6c41d6\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.995559 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbfjz\" (UniqueName: \"kubernetes.io/projected/06a5238b-e7e1-49a5-9bb8-5f6162183a13-kube-api-access-zbfjz\") pod \"ironic-operator-controller-manager-6bbb499bbc-t2qf9\" (UID: \"06a5238b-e7e1-49a5-9bb8-5f6162183a13\") " 
pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.995594 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psnbg\" (UniqueName: \"kubernetes.io/projected/90f49412-d2c3-46ba-9591-5adee9624834-kube-api-access-psnbg\") pod \"manila-operator-controller-manager-68f45f9d9f-8mq9k\" (UID: \"90f49412-d2c3-46ba-9591-5adee9624834\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k" Mar 10 14:20:05 crc kubenswrapper[4911]: I0310 14:20:05.996078 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h"] Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:05.999102 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.003667 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cf65\" (UniqueName: \"kubernetes.io/projected/998d9bc8-11c1-4967-b3d9-1c823d6c41d6-kube-api-access-6cf65\") pod \"keystone-operator-controller-manager-684f77d66d-xw7bn\" (UID: \"998d9bc8-11c1-4967-b3d9-1c823d6c41d6\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.004555 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.004599 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nhpnk" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.020940 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.029061 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552540-bbrwg" event={"ID":"708857de-1db1-4764-8041-2bd173460cea","Type":"ContainerDied","Data":"f2dd8d74f21312ca7582ff609384d50b649edc5710d606c6ea9cefd82e70cfc2"} Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.029105 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2dd8d74f21312ca7582ff609384d50b649edc5710d606c6ea9cefd82e70cfc2" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.029193 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552540-bbrwg" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.043324 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2"] Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.044006 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.047745 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.054262 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-cxj79" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.059980 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h"] Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.067451 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4z9n\" (UniqueName: \"kubernetes.io/projected/53d2a376-b957-4875-8bfe-42d5dbc0a634-kube-api-access-m4z9n\") pod \"mariadb-operator-controller-manager-658d4cdd5-7pcfv\" (UID: \"53d2a376-b957-4875-8bfe-42d5dbc0a634\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.067553 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc6vg\" (UniqueName: \"kubernetes.io/projected/c5080d31-7711-4e4a-9902-4843929a16e9-kube-api-access-tc6vg\") pod \"nova-operator-controller-manager-569cc54c5-57mls\" (UID: \"c5080d31-7711-4e4a-9902-4843929a16e9\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.067586 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l545\" (UniqueName: \"kubernetes.io/projected/7c70b1a5-051f-43b5-80a3-1b462b9a50f8-kube-api-access-4l545\") pod \"neutron-operator-controller-manager-776c5696bf-5v9fq\" (UID: \"7c70b1a5-051f-43b5-80a3-1b462b9a50f8\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.067612 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2zbj4\" (UniqueName: \"kubernetes.io/projected/5cf94e3e-325d-4364-bf70-c479683b2be6-kube-api-access-2zbj4\") pod \"octavia-operator-controller-manager-5f4f55cb5c-cqdch\" (UID: \"5cf94e3e-325d-4364-bf70-c479683b2be6\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.094770 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.107336 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4z9n\" (UniqueName: \"kubernetes.io/projected/53d2a376-b957-4875-8bfe-42d5dbc0a634-kube-api-access-m4z9n\") pod \"mariadb-operator-controller-manager-658d4cdd5-7pcfv\" (UID: \"53d2a376-b957-4875-8bfe-42d5dbc0a634\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.108775 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zbj4\" (UniqueName: \"kubernetes.io/projected/5cf94e3e-325d-4364-bf70-c479683b2be6-kube-api-access-2zbj4\") pod \"octavia-operator-controller-manager-5f4f55cb5c-cqdch\" (UID: \"5cf94e3e-325d-4364-bf70-c479683b2be6\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.114396 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc6vg\" (UniqueName: \"kubernetes.io/projected/c5080d31-7711-4e4a-9902-4843929a16e9-kube-api-access-tc6vg\") pod \"nova-operator-controller-manager-569cc54c5-57mls\" (UID: \"c5080d31-7711-4e4a-9902-4843929a16e9\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.126439 
4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l545\" (UniqueName: \"kubernetes.io/projected/7c70b1a5-051f-43b5-80a3-1b462b9a50f8-kube-api-access-4l545\") pod \"neutron-operator-controller-manager-776c5696bf-5v9fq\" (UID: \"7c70b1a5-051f-43b5-80a3-1b462b9a50f8\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.148619 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.150140 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.154518 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-r5gkx"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.163181 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.168439 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8k5b\" (UniqueName: \"kubernetes.io/projected/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-kube-api-access-c8k5b\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.168837 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.169015 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6f28\" (UniqueName: \"kubernetes.io/projected/a036efe0-e6cc-4ebe-8b06-70bc180b7b1c-kube-api-access-h6f28\") pod \"ovn-operator-controller-manager-bbc5b68f9-2mnh2\" (UID: \"a036efe0-e6cc-4ebe-8b06-70bc180b7b1c\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.178561 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.180553 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.188561 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.189593 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-qnzqv"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.194354 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.195258 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.235166 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.236832 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.241025 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.247678 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-5zllg"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.256165 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.269615 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.272697 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.272747 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6f28\" (UniqueName: \"kubernetes.io/projected/a036efe0-e6cc-4ebe-8b06-70bc180b7b1c-kube-api-access-h6f28\") pod \"ovn-operator-controller-manager-bbc5b68f9-2mnh2\" (UID: \"a036efe0-e6cc-4ebe-8b06-70bc180b7b1c\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.272799 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqsvq\" (UniqueName: \"kubernetes.io/projected/a07393e2-b210-4e68-8cd3-62a838f86071-kube-api-access-bqsvq\") pod \"placement-operator-controller-manager-574d45c66c-khnvw\" (UID: \"a07393e2-b210-4e68-8cd3-62a838f86071\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.272849 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbwdc\" (UniqueName: \"kubernetes.io/projected/fca0a377-f77c-4e24-aec1-8ffb8ba87963-kube-api-access-nbwdc\") pod \"swift-operator-controller-manager-677c674df7-8jk8x\" (UID: \"fca0a377-f77c-4e24-aec1-8ffb8ba87963\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.272886 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8k5b\" (UniqueName: \"kubernetes.io/projected/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-kube-api-access-c8k5b\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h"
Mar 10 14:20:06 crc kubenswrapper[4911]: E0310 14:20:06.273621 4911 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 14:20:06 crc kubenswrapper[4911]: E0310 14:20:06.273753 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert podName:f82c1a17-4dc8-48c2-9bc2-9d7168524de3 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:06.773698379 +0000 UTC m=+1111.337218296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" (UID: "f82c1a17-4dc8-48c2-9bc2-9d7168524de3") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.292138 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.347001 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.368521 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6f28\" (UniqueName: \"kubernetes.io/projected/a036efe0-e6cc-4ebe-8b06-70bc180b7b1c-kube-api-access-h6f28\") pod \"ovn-operator-controller-manager-bbc5b68f9-2mnh2\" (UID: \"a036efe0-e6cc-4ebe-8b06-70bc180b7b1c\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.370345 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.377689 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8k5b\" (UniqueName: \"kubernetes.io/projected/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-kube-api-access-c8k5b\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.385029 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqsvq\" (UniqueName: \"kubernetes.io/projected/a07393e2-b210-4e68-8cd3-62a838f86071-kube-api-access-bqsvq\") pod \"placement-operator-controller-manager-574d45c66c-khnvw\" (UID: \"a07393e2-b210-4e68-8cd3-62a838f86071\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.385164 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.385203 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbwdc\" (UniqueName: \"kubernetes.io/projected/fca0a377-f77c-4e24-aec1-8ffb8ba87963-kube-api-access-nbwdc\") pod \"swift-operator-controller-manager-677c674df7-8jk8x\" (UID: \"fca0a377-f77c-4e24-aec1-8ffb8ba87963\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.385345 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdlgw\" (UniqueName: \"kubernetes.io/projected/c8487a91-d6ca-480d-a451-35e6516bc9e8-kube-api-access-qdlgw\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-pgf4d\" (UID: \"c8487a91-d6ca-480d-a451-35e6516bc9e8\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d"
Mar 10 14:20:06 crc kubenswrapper[4911]: E0310 14:20:06.415884 4911 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 10 14:20:06 crc kubenswrapper[4911]: E0310 14:20:06.415958 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert podName:c5336054-5038-40f7-8512-9fe34269f6cd nodeName:}" failed. No retries permitted until 2026-03-10 14:20:07.415935141 +0000 UTC m=+1111.979455058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert") pod "infra-operator-controller-manager-5995f4446f-jjsfs" (UID: "c5336054-5038-40f7-8512-9fe34269f6cd") : secret "infra-operator-webhook-server-cert" not found
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.435145 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.436479 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqsvq\" (UniqueName: \"kubernetes.io/projected/a07393e2-b210-4e68-8cd3-62a838f86071-kube-api-access-bqsvq\") pod \"placement-operator-controller-manager-574d45c66c-khnvw\" (UID: \"a07393e2-b210-4e68-8cd3-62a838f86071\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.452651 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbwdc\" (UniqueName: \"kubernetes.io/projected/fca0a377-f77c-4e24-aec1-8ffb8ba87963-kube-api-access-nbwdc\") pod \"swift-operator-controller-manager-677c674df7-8jk8x\" (UID: \"fca0a377-f77c-4e24-aec1-8ffb8ba87963\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.489651 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdlgw\" (UniqueName: \"kubernetes.io/projected/c8487a91-d6ca-480d-a451-35e6516bc9e8-kube-api-access-qdlgw\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-pgf4d\" (UID: \"c8487a91-d6ca-480d-a451-35e6516bc9e8\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.499870 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.501021 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.505669 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-9j5k4"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.511974 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.528576 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdlgw\" (UniqueName: \"kubernetes.io/projected/c8487a91-d6ca-480d-a451-35e6516bc9e8-kube-api-access-qdlgw\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-pgf4d\" (UID: \"c8487a91-d6ca-480d-a451-35e6516bc9e8\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.529435 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.540136 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.548763 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.549811 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.557811 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.565183 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4vcf7"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.592005 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx4bb\" (UniqueName: \"kubernetes.io/projected/902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6-kube-api-access-tx4bb\") pod \"test-operator-controller-manager-5c5cb9c4d7-wltgc\" (UID: \"902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.610846 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.611858 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.618464 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-fsg9z"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.618988 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.619098 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.632261 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.633821 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552534-8wvx4"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.646800 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552534-8wvx4"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.665537 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.667110 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.669676 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jjx96"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.673023 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.693923 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx4bb\" (UniqueName: \"kubernetes.io/projected/902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6-kube-api-access-tx4bb\") pod \"test-operator-controller-manager-5c5cb9c4d7-wltgc\" (UID: \"902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.694017 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8db7v\" (UniqueName: \"kubernetes.io/projected/ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20-kube-api-access-8db7v\") pod \"watcher-operator-controller-manager-6dd88c6f67-fmrnr\" (UID: \"ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.694056 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsnq9\" (UniqueName: \"kubernetes.io/projected/0fcc8b66-2a29-45c8-a445-a14770e3f157-kube-api-access-dsnq9\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.694091 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.694150 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.717844 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx4bb\" (UniqueName: \"kubernetes.io/projected/902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6-kube-api-access-tx4bb\") pod \"test-operator-controller-manager-5c5cb9c4d7-wltgc\" (UID: \"902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.726062 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.765457 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.798011 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8db7v\" (UniqueName: \"kubernetes.io/projected/ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20-kube-api-access-8db7v\") pod \"watcher-operator-controller-manager-6dd88c6f67-fmrnr\" (UID: \"ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.798059 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsnq9\" (UniqueName: \"kubernetes.io/projected/0fcc8b66-2a29-45c8-a445-a14770e3f157-kube-api-access-dsnq9\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.798107 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.798148 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.798182 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.798260 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxgn2\" (UniqueName: \"kubernetes.io/projected/0b94f7c5-35a4-430f-bccb-011f386954d5-kube-api-access-nxgn2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7n8pz\" (UID: \"0b94f7c5-35a4-430f-bccb-011f386954d5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz"
Mar 10 14:20:06 crc kubenswrapper[4911]: E0310 14:20:06.798501 4911 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 10 14:20:06 crc kubenswrapper[4911]: E0310 14:20:06.798559 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:07.298537712 +0000 UTC m=+1111.862057639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "metrics-server-cert" not found
Mar 10 14:20:06 crc kubenswrapper[4911]: E0310 14:20:06.798674 4911 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 14:20:06 crc kubenswrapper[4911]: E0310 14:20:06.798710 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert podName:f82c1a17-4dc8-48c2-9bc2-9d7168524de3 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:07.798699676 +0000 UTC m=+1112.362219593 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" (UID: "f82c1a17-4dc8-48c2-9bc2-9d7168524de3") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 14:20:06 crc kubenswrapper[4911]: E0310 14:20:06.798777 4911 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 10 14:20:06 crc kubenswrapper[4911]: E0310 14:20:06.798805 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:07.298796509 +0000 UTC m=+1111.862316426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "webhook-server-cert" not found
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.841697 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsnq9\" (UniqueName: \"kubernetes.io/projected/0fcc8b66-2a29-45c8-a445-a14770e3f157-kube-api-access-dsnq9\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.846960 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.852643 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8db7v\" (UniqueName: \"kubernetes.io/projected/ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20-kube-api-access-8db7v\") pod \"watcher-operator-controller-manager-6dd88c6f67-fmrnr\" (UID: \"ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.900635 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxgn2\" (UniqueName: \"kubernetes.io/projected/0b94f7c5-35a4-430f-bccb-011f386954d5-kube-api-access-nxgn2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7n8pz\" (UID: \"0b94f7c5-35a4-430f-bccb-011f386954d5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz"
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.924058 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl"]
Mar 10 14:20:06 crc kubenswrapper[4911]: I0310 14:20:06.939105 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxgn2\" (UniqueName: \"kubernetes.io/projected/0b94f7c5-35a4-430f-bccb-011f386954d5-kube-api-access-nxgn2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7n8pz\" (UID: \"0b94f7c5-35a4-430f-bccb-011f386954d5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz"
Mar 10 14:20:06 crc kubenswrapper[4911]: W0310 14:20:06.939835 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2b89cc3_8229_4401_b8e9_9a32bffb0f57.slice/crio-a2a20e084eba8c4d85e0aa03511fa056fc30fdb629573f49f7cd6c9bd20b1376 WatchSource:0}: Error finding container a2a20e084eba8c4d85e0aa03511fa056fc30fdb629573f49f7cd6c9bd20b1376: Status 404 returned error can't find the container with id a2a20e084eba8c4d85e0aa03511fa056fc30fdb629573f49f7cd6c9bd20b1376
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.028184 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr"
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.031597 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq"]
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.081916 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz"
Mar 10 14:20:07 crc kubenswrapper[4911]: W0310 14:20:07.158124 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda937a94a_14cb_4319_9147_d0ac60c5cc6a.slice/crio-1eedf612ef1f1165afe202f139c9480bc063b0d5f93aaa3b6fadaa60bd56f562 WatchSource:0}: Error finding container 1eedf612ef1f1165afe202f139c9480bc063b0d5f93aaa3b6fadaa60bd56f562: Status 404 returned error can't find the container with id 1eedf612ef1f1165afe202f139c9480bc063b0d5f93aaa3b6fadaa60bd56f562
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.181093 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl" event={"ID":"e2b89cc3-8229-4401-b8e9-9a32bffb0f57","Type":"ContainerStarted","Data":"a2a20e084eba8c4d85e0aa03511fa056fc30fdb629573f49f7cd6c9bd20b1376"}
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.190373 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7"]
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.194957 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw" event={"ID":"c8490ff5-eaf3-4d9e-b9f9-7ad3ae159298","Type":"ContainerStarted","Data":"65833e8ff428806ba99509ae143d9c3727aa1743afa5cb38b73aee57a3fdbad3"}
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.263554 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv"]
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.319695 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.319792 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"
Mar 10 14:20:07 crc kubenswrapper[4911]: E0310 14:20:07.321399 4911 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 10 14:20:07 crc kubenswrapper[4911]: E0310 14:20:07.321463 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:08.321437523 +0000 UTC m=+1112.884957440 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "metrics-server-cert" not found
Mar 10 14:20:07 crc kubenswrapper[4911]: E0310 14:20:07.325924 4911 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 10 14:20:07 crc kubenswrapper[4911]: E0310 14:20:07.326033 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:08.326007124 +0000 UTC m=+1112.889527041 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "webhook-server-cert" not found
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.426583 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs"
Mar 10 14:20:07 crc kubenswrapper[4911]: E0310 14:20:07.426856 4911 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 10 14:20:07 crc kubenswrapper[4911]: E0310 14:20:07.426920 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert podName:c5336054-5038-40f7-8512-9fe34269f6cd nodeName:}" failed. No retries permitted until 2026-03-10 14:20:09.426900453 +0000 UTC m=+1113.990420370 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert") pod "infra-operator-controller-manager-5995f4446f-jjsfs" (UID: "c5336054-5038-40f7-8512-9fe34269f6cd") : secret "infra-operator-webhook-server-cert" not found
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.440465 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9"]
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.548995 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm"]
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.584638 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k"]
Mar 10 14:20:07 crc kubenswrapper[4911]: W0310 14:20:07.633587 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90f49412_d2c3_46ba_9591_5adee9624834.slice/crio-55cfab18ec9007f462335d2b1cd55ce658e20183c558b78ee0fbeee0ace770be WatchSource:0}: Error finding container 55cfab18ec9007f462335d2b1cd55ce658e20183c558b78ee0fbeee0ace770be: Status 404 returned error can't find the container with id 55cfab18ec9007f462335d2b1cd55ce658e20183c558b78ee0fbeee0ace770be
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.647799 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn"]
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.816293 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-57mls"]
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.827132 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw"]
Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.833037 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h"
Mar 10 14:20:07 crc kubenswrapper[4911]: E0310 14:20:07.833274 4911 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 14:20:07 crc kubenswrapper[4911]: E0310 14:20:07.833378 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert podName:f82c1a17-4dc8-48c2-9bc2-9d7168524de3 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:09.833350354 +0000 UTC m=+1114.396870271 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" (UID: "f82c1a17-4dc8-48c2-9bc2-9d7168524de3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.941831 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq"] Mar 10 14:20:07 crc kubenswrapper[4911]: W0310 14:20:07.946571 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c70b1a5_051f_43b5_80a3_1b462b9a50f8.slice/crio-fe9fe50ee501fe009a49b7f0f9d407bfe85f884cfeaf696f6034f27fcd35483c WatchSource:0}: Error finding container fe9fe50ee501fe009a49b7f0f9d407bfe85f884cfeaf696f6034f27fcd35483c: Status 404 returned error can't find the container with id fe9fe50ee501fe009a49b7f0f9d407bfe85f884cfeaf696f6034f27fcd35483c Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.948109 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch"] Mar 10 14:20:07 crc kubenswrapper[4911]: I0310 14:20:07.971312 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x"] Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.054289 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc"] Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.071164 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7"] Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.090954 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2"] Mar 10 14:20:08 crc kubenswrapper[4911]: W0310 14:20:08.094454 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod902c813a_1cba_4e57_9d1e_0e0a8ab0f6d6.slice/crio-7f0c5a88eb1f65fe527c117d5764fbf414a44216f7a85db224fcba6824d245bd WatchSource:0}: Error finding container 7f0c5a88eb1f65fe527c117d5764fbf414a44216f7a85db224fcba6824d245bd: Status 404 returned error can't find the container with id 7f0c5a88eb1f65fe527c117d5764fbf414a44216f7a85db224fcba6824d245bd Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.097814 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d"] Mar 10 14:20:08 crc kubenswrapper[4911]: W0310 14:20:08.124502 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8487a91_d6ca_480d_a451_35e6516bc9e8.slice/crio-f631e8b692285bc07be6a3ad26bb70c6e2b6af03257469a2c046bc7aabfbbc47 WatchSource:0}: Error finding container f631e8b692285bc07be6a3ad26bb70c6e2b6af03257469a2c046bc7aabfbbc47: Status 404 returned error can't find the container with id f631e8b692285bc07be6a3ad26bb70c6e2b6af03257469a2c046bc7aabfbbc47 Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.130945 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h6f28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-2mnh2_openstack-operators(a036efe0-e6cc-4ebe-8b06-70bc180b7b1c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.131133 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qdlgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-pgf4d_openstack-operators(c8487a91-d6ca-480d-a451-35e6516bc9e8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.132113 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2" podUID="a036efe0-e6cc-4ebe-8b06-70bc180b7b1c" Mar 10 14:20:08 crc 
kubenswrapper[4911]: E0310 14:20:08.132357 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d" podUID="c8487a91-d6ca-480d-a451-35e6516bc9e8" Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.158331 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz"] Mar 10 14:20:08 crc kubenswrapper[4911]: W0310 14:20:08.161739 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b94f7c5_35a4_430f_bccb_011f386954d5.slice/crio-43a06f603c1fb184467e2ace3a862636b3d411235aedaf25bad1ef1a240dd1d7 WatchSource:0}: Error finding container 43a06f603c1fb184467e2ace3a862636b3d411235aedaf25bad1ef1a240dd1d7: Status 404 returned error can't find the container with id 43a06f603c1fb184467e2ace3a862636b3d411235aedaf25bad1ef1a240dd1d7 Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.172889 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr"] Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.174657 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nxgn2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7n8pz_openstack-operators(0b94f7c5-35a4-430f-bccb-011f386954d5): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.176369 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz" podUID="0b94f7c5-35a4-430f-bccb-011f386954d5" Mar 10 14:20:08 crc kubenswrapper[4911]: W0310 14:20:08.190170 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef5ab7b9_910d_4c79_9f03_ad4ce9fc6a20.slice/crio-230c3bdaa1f5ee72b7664e53e2b40ad4abb0b5684c1043635cd31aebd8c2288d WatchSource:0}: Error finding container 230c3bdaa1f5ee72b7664e53e2b40ad4abb0b5684c1043635cd31aebd8c2288d: Status 404 returned error can't find the container with id 230c3bdaa1f5ee72b7664e53e2b40ad4abb0b5684c1043635cd31aebd8c2288d Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.194464 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8db7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-fmrnr_openstack-operators(ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.197043 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr" podUID="ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20" Mar 10 14:20:08 crc 
kubenswrapper[4911]: I0310 14:20:08.214576 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef59c911-45bc-4848-8c29-f63b38053e1d" path="/var/lib/kubelet/pods/ef59c911-45bc-4848-8c29-f63b38053e1d/volumes" Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.221819 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2" event={"ID":"a036efe0-e6cc-4ebe-8b06-70bc180b7b1c","Type":"ContainerStarted","Data":"839b2cce16620211e180a9c82ab525f7aaf6c822320adbc512e0a713d5b37841"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.223290 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv" event={"ID":"53d2a376-b957-4875-8bfe-42d5dbc0a634","Type":"ContainerStarted","Data":"231bdd2f725d05d0affe70fa487fcfefec794000f1b162bbd1550f407bda2f72"} Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.224694 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2" podUID="a036efe0-e6cc-4ebe-8b06-70bc180b7b1c" Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.225976 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" event={"ID":"06a5238b-e7e1-49a5-9bb8-5f6162183a13","Type":"ContainerStarted","Data":"3bc1a9d600b38bbb5c8581f198e62ef80ecf8811b41d04860c02ef1d1ee6f243"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.227681 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr" 
event={"ID":"ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20","Type":"ContainerStarted","Data":"230c3bdaa1f5ee72b7664e53e2b40ad4abb0b5684c1043635cd31aebd8c2288d"} Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.228947 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr" podUID="ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20" Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.236087 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d" event={"ID":"c8487a91-d6ca-480d-a451-35e6516bc9e8","Type":"ContainerStarted","Data":"f631e8b692285bc07be6a3ad26bb70c6e2b6af03257469a2c046bc7aabfbbc47"} Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.246425 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d" podUID="c8487a91-d6ca-480d-a451-35e6516bc9e8" Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.247471 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k" event={"ID":"90f49412-d2c3-46ba-9591-5adee9624834","Type":"ContainerStarted","Data":"55cfab18ec9007f462335d2b1cd55ce658e20183c558b78ee0fbeee0ace770be"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.256219 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7" event={"ID":"0dabe548-2d6c-4bbb-8199-6403e57d2ac9","Type":"ContainerStarted","Data":"e6d241455f020b2f2b94f30a96eb2c46288e2d1fc19e7dacd0153e2cdf34816d"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.258898 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc" event={"ID":"902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6","Type":"ContainerStarted","Data":"7f0c5a88eb1f65fe527c117d5764fbf414a44216f7a85db224fcba6824d245bd"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.273032 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" event={"ID":"14dd9547-ff92-4cb4-a055-e41fd390e90e","Type":"ContainerStarted","Data":"cb0c4707da4702fb5d3e1b962884bd23adc9f16371a8863ebfea31d31f8a8504"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.289296 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch" event={"ID":"5cf94e3e-325d-4364-bf70-c479683b2be6","Type":"ContainerStarted","Data":"99cb76b5d697a228d25294fdf96007defcd4a22bdf5424fe83cbce44a3a80f03"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.294168 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz" event={"ID":"0b94f7c5-35a4-430f-bccb-011f386954d5","Type":"ContainerStarted","Data":"43a06f603c1fb184467e2ace3a862636b3d411235aedaf25bad1ef1a240dd1d7"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.295926 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" event={"ID":"a937a94a-14cb-4319-9147-d0ac60c5cc6a","Type":"ContainerStarted","Data":"1eedf612ef1f1165afe202f139c9480bc063b0d5f93aaa3b6fadaa60bd56f562"} Mar 10 14:20:08 crc 
kubenswrapper[4911]: E0310 14:20:08.297450 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz" podUID="0b94f7c5-35a4-430f-bccb-011f386954d5" Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.298880 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw" event={"ID":"a07393e2-b210-4e68-8cd3-62a838f86071","Type":"ContainerStarted","Data":"79ca74a85c118acc89afd74fbb3bf6893c38e51c207f891fad05b38c6f66f351"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.300746 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x" event={"ID":"fca0a377-f77c-4e24-aec1-8ffb8ba87963","Type":"ContainerStarted","Data":"5218098da8f197c4a72bf58ebed5d921222b797dd0b6e3fc9410b4de6b7004b0"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.302873 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq" event={"ID":"7c70b1a5-051f-43b5-80a3-1b462b9a50f8","Type":"ContainerStarted","Data":"fe9fe50ee501fe009a49b7f0f9d407bfe85f884cfeaf696f6034f27fcd35483c"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.304782 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" event={"ID":"998d9bc8-11c1-4967-b3d9-1c823d6c41d6","Type":"ContainerStarted","Data":"addf151e9ec72eaf1f63a469ac110ff9ef7da44cee5dd52ada86effc13416d2e"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.306003 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" event={"ID":"6a0bd4c9-4420-48be-9637-67ea2b5c89d1","Type":"ContainerStarted","Data":"0eb9ce93b82fa8a4aabd0c5ef8c6ce535b31ea6c688299743f996d7d9b9ae68c"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.307533 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" event={"ID":"c5080d31-7711-4e4a-9902-4843929a16e9","Type":"ContainerStarted","Data":"e5997302c0c8adfcb553221e1d96573913e85f690a13734a828fb13880ca1a20"} Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.349411 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:08 crc kubenswrapper[4911]: I0310 14:20:08.349575 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.349674 4911 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.349771 4911 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.349822 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:10.349798155 +0000 UTC m=+1114.913318072 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "metrics-server-cert" not found Mar 10 14:20:08 crc kubenswrapper[4911]: E0310 14:20:08.349842 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:10.349836406 +0000 UTC m=+1114.913356313 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "webhook-server-cert" not found Mar 10 14:20:09 crc kubenswrapper[4911]: E0310 14:20:09.328493 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr" podUID="ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20" Mar 10 14:20:09 crc kubenswrapper[4911]: E0310 14:20:09.328746 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2" podUID="a036efe0-e6cc-4ebe-8b06-70bc180b7b1c" Mar 10 14:20:09 crc kubenswrapper[4911]: E0310 14:20:09.330517 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz" podUID="0b94f7c5-35a4-430f-bccb-011f386954d5" Mar 10 14:20:09 crc kubenswrapper[4911]: E0310 14:20:09.331260 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d" podUID="c8487a91-d6ca-480d-a451-35e6516bc9e8" Mar 10 14:20:09 crc kubenswrapper[4911]: I0310 14:20:09.484614 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:09 crc kubenswrapper[4911]: E0310 14:20:09.484915 4911 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 14:20:09 crc kubenswrapper[4911]: E0310 14:20:09.485053 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert podName:c5336054-5038-40f7-8512-9fe34269f6cd nodeName:}" failed. No retries permitted until 2026-03-10 14:20:13.485020582 +0000 UTC m=+1118.048540499 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert") pod "infra-operator-controller-manager-5995f4446f-jjsfs" (UID: "c5336054-5038-40f7-8512-9fe34269f6cd") : secret "infra-operator-webhook-server-cert" not found Mar 10 14:20:09 crc kubenswrapper[4911]: I0310 14:20:09.895321 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" Mar 10 14:20:09 crc kubenswrapper[4911]: E0310 14:20:09.895535 4911 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 14:20:09 crc kubenswrapper[4911]: E0310 14:20:09.895645 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert podName:f82c1a17-4dc8-48c2-9bc2-9d7168524de3 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:13.895615122 +0000 UTC m=+1118.459135219 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" (UID: "f82c1a17-4dc8-48c2-9bc2-9d7168524de3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 14:20:10 crc kubenswrapper[4911]: I0310 14:20:10.412963 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:10 crc kubenswrapper[4911]: I0310 14:20:10.413051 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:10 crc kubenswrapper[4911]: E0310 14:20:10.413289 4911 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 14:20:10 crc kubenswrapper[4911]: E0310 14:20:10.413363 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:14.413339367 +0000 UTC m=+1118.976859274 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "webhook-server-cert" not found Mar 10 14:20:10 crc kubenswrapper[4911]: E0310 14:20:10.413481 4911 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 14:20:10 crc kubenswrapper[4911]: E0310 14:20:10.413585 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:14.413558133 +0000 UTC m=+1118.977078200 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "metrics-server-cert" not found Mar 10 14:20:13 crc kubenswrapper[4911]: I0310 14:20:13.485419 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:13 crc kubenswrapper[4911]: E0310 14:20:13.485741 4911 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 14:20:13 crc kubenswrapper[4911]: E0310 14:20:13.485849 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert 
podName:c5336054-5038-40f7-8512-9fe34269f6cd nodeName:}" failed. No retries permitted until 2026-03-10 14:20:21.485786696 +0000 UTC m=+1126.049306613 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert") pod "infra-operator-controller-manager-5995f4446f-jjsfs" (UID: "c5336054-5038-40f7-8512-9fe34269f6cd") : secret "infra-operator-webhook-server-cert" not found Mar 10 14:20:13 crc kubenswrapper[4911]: I0310 14:20:13.895759 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" Mar 10 14:20:13 crc kubenswrapper[4911]: E0310 14:20:13.896220 4911 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 14:20:13 crc kubenswrapper[4911]: E0310 14:20:13.896353 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert podName:f82c1a17-4dc8-48c2-9bc2-9d7168524de3 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:21.896321735 +0000 UTC m=+1126.459841802 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" (UID: "f82c1a17-4dc8-48c2-9bc2-9d7168524de3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 14:20:14 crc kubenswrapper[4911]: I0310 14:20:14.505013 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:14 crc kubenswrapper[4911]: I0310 14:20:14.505083 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:14 crc kubenswrapper[4911]: E0310 14:20:14.505252 4911 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 14:20:14 crc kubenswrapper[4911]: E0310 14:20:14.505299 4911 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 14:20:14 crc kubenswrapper[4911]: E0310 14:20:14.505321 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:22.505303113 +0000 UTC m=+1127.068823030 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "webhook-server-cert" not found Mar 10 14:20:14 crc kubenswrapper[4911]: E0310 14:20:14.505480 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:22.505406036 +0000 UTC m=+1127.068925953 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "metrics-server-cert" not found Mar 10 14:20:21 crc kubenswrapper[4911]: E0310 14:20:21.477173 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6" Mar 10 14:20:21 crc kubenswrapper[4911]: E0310 14:20:21.477967 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vgbvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-77b6666d85-8mqrm_openstack-operators(14dd9547-ff92-4cb4-a055-e41fd390e90e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:20:21 crc kubenswrapper[4911]: E0310 14:20:21.479186 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" podUID="14dd9547-ff92-4cb4-a055-e41fd390e90e" Mar 10 14:20:21 crc kubenswrapper[4911]: I0310 14:20:21.530396 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:21 crc kubenswrapper[4911]: E0310 14:20:21.530579 4911 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Mar 10 14:20:21 crc kubenswrapper[4911]: E0310 14:20:21.530633 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert podName:c5336054-5038-40f7-8512-9fe34269f6cd nodeName:}" failed. No retries permitted until 2026-03-10 14:20:37.530617479 +0000 UTC m=+1142.094137396 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert") pod "infra-operator-controller-manager-5995f4446f-jjsfs" (UID: "c5336054-5038-40f7-8512-9fe34269f6cd") : secret "infra-operator-webhook-server-cert" not found Mar 10 14:20:21 crc kubenswrapper[4911]: I0310 14:20:21.936193 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" Mar 10 14:20:21 crc kubenswrapper[4911]: E0310 14:20:21.936440 4911 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 14:20:21 crc kubenswrapper[4911]: E0310 14:20:21.936550 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert podName:f82c1a17-4dc8-48c2-9bc2-9d7168524de3 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:37.936521126 +0000 UTC m=+1142.500041043 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" (UID: "f82c1a17-4dc8-48c2-9bc2-9d7168524de3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 14:20:22 crc kubenswrapper[4911]: E0310 14:20:22.362469 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 10 14:20:22 crc kubenswrapper[4911]: E0310 14:20:22.362706 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tx4bb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-wltgc_openstack-operators(902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:20:22 crc kubenswrapper[4911]: E0310 14:20:22.363929 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc" podUID="902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6" Mar 10 14:20:22 crc kubenswrapper[4911]: E0310 14:20:22.448381 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc" podUID="902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6" Mar 10 14:20:22 crc kubenswrapper[4911]: E0310 14:20:22.451430 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6\\\"\"" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" podUID="14dd9547-ff92-4cb4-a055-e41fd390e90e" Mar 10 14:20:22 crc kubenswrapper[4911]: I0310 14:20:22.544864 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:22 crc kubenswrapper[4911]: I0310 14:20:22.544959 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:22 crc kubenswrapper[4911]: E0310 14:20:22.545820 4911 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 14:20:22 crc kubenswrapper[4911]: E0310 14:20:22.545898 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs 
podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:38.545871634 +0000 UTC m=+1143.109391551 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "metrics-server-cert" not found Mar 10 14:20:22 crc kubenswrapper[4911]: E0310 14:20:22.546406 4911 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 14:20:22 crc kubenswrapper[4911]: E0310 14:20:22.546450 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs podName:0fcc8b66-2a29-45c8-a445-a14770e3f157 nodeName:}" failed. No retries permitted until 2026-03-10 14:20:38.546437539 +0000 UTC m=+1143.109957466 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs") pod "openstack-operator-controller-manager-774dfd9959-g5lwx" (UID: "0fcc8b66-2a29-45c8-a445-a14770e3f157") : secret "webhook-server-cert" not found Mar 10 14:20:23 crc kubenswrapper[4911]: E0310 14:20:23.101357 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9" Mar 10 14:20:23 crc kubenswrapper[4911]: E0310 14:20:23.101889 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hs99z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-984cd4dcf-hngxq_openstack-operators(a937a94a-14cb-4319-9147-d0ac60c5cc6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:20:23 crc kubenswrapper[4911]: E0310 14:20:23.104886 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" podUID="a937a94a-14cb-4319-9147-d0ac60c5cc6a" Mar 10 14:20:23 crc kubenswrapper[4911]: E0310 14:20:23.456797 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" podUID="a937a94a-14cb-4319-9147-d0ac60c5cc6a" Mar 10 14:20:24 crc kubenswrapper[4911]: E0310 14:20:24.015339 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721" Mar 10 14:20:24 crc kubenswrapper[4911]: E0310 14:20:24.016298 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4l545,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-5v9fq_openstack-operators(7c70b1a5-051f-43b5-80a3-1b462b9a50f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:20:24 crc kubenswrapper[4911]: E0310 14:20:24.017573 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq" podUID="7c70b1a5-051f-43b5-80a3-1b462b9a50f8" Mar 10 14:20:24 crc kubenswrapper[4911]: E0310 14:20:24.465415 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq" podUID="7c70b1a5-051f-43b5-80a3-1b462b9a50f8" Mar 10 14:20:24 crc kubenswrapper[4911]: E0310 14:20:24.799038 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 10 14:20:24 crc kubenswrapper[4911]: E0310 14:20:24.799330 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6cf65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-xw7bn_openstack-operators(998d9bc8-11c1-4967-b3d9-1c823d6c41d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:20:24 crc kubenswrapper[4911]: E0310 14:20:24.800542 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" podUID="998d9bc8-11c1-4967-b3d9-1c823d6c41d6" Mar 10 14:20:25 crc kubenswrapper[4911]: E0310 14:20:25.470964 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" podUID="998d9bc8-11c1-4967-b3d9-1c823d6c41d6" Mar 10 14:20:25 crc kubenswrapper[4911]: E0310 14:20:25.514695 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60" Mar 10 14:20:25 crc kubenswrapper[4911]: E0310 14:20:25.514957 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rlmzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5964f64c48-z5pz7_openstack-operators(6a0bd4c9-4420-48be-9637-67ea2b5c89d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:20:25 crc kubenswrapper[4911]: E0310 14:20:25.516342 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" podUID="6a0bd4c9-4420-48be-9637-67ea2b5c89d1" Mar 10 14:20:26 crc kubenswrapper[4911]: E0310 14:20:26.480882 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60\\\"\"" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" podUID="6a0bd4c9-4420-48be-9637-67ea2b5c89d1" Mar 10 14:20:27 crc kubenswrapper[4911]: E0310 14:20:27.574896 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f" Mar 10 14:20:27 crc kubenswrapper[4911]: E0310 14:20:27.575517 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zbfjz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6bbb499bbc-t2qf9_openstack-operators(06a5238b-e7e1-49a5-9bb8-5f6162183a13): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:20:27 crc kubenswrapper[4911]: E0310 14:20:27.578595 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" podUID="06a5238b-e7e1-49a5-9bb8-5f6162183a13" Mar 10 14:20:28 crc kubenswrapper[4911]: E0310 14:20:28.063827 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922" Mar 10 14:20:28 crc kubenswrapper[4911]: E0310 14:20:28.064145 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tc6vg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-569cc54c5-57mls_openstack-operators(c5080d31-7711-4e4a-9902-4843929a16e9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:20:28 crc kubenswrapper[4911]: E0310 14:20:28.065409 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" podUID="c5080d31-7711-4e4a-9902-4843929a16e9" Mar 10 14:20:28 crc kubenswrapper[4911]: E0310 14:20:28.498562 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" podUID="c5080d31-7711-4e4a-9902-4843929a16e9" Mar 10 14:20:28 crc kubenswrapper[4911]: E0310 14:20:28.498890 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" podUID="06a5238b-e7e1-49a5-9bb8-5f6162183a13" Mar 10 14:20:37 crc kubenswrapper[4911]: I0310 14:20:37.096766 4911 scope.go:117] "RemoveContainer" containerID="731bdd285ad9b972671a61b855174efa743f141ab1c46d2e9c0f79117ad2355a" Mar 10 14:20:37 crc kubenswrapper[4911]: I0310 14:20:37.554449 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:37 crc kubenswrapper[4911]: I0310 14:20:37.561987 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5336054-5038-40f7-8512-9fe34269f6cd-cert\") pod \"infra-operator-controller-manager-5995f4446f-jjsfs\" (UID: \"c5336054-5038-40f7-8512-9fe34269f6cd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:37 crc kubenswrapper[4911]: I0310 14:20:37.793039 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-w7xjt" Mar 10 14:20:37 crc kubenswrapper[4911]: I0310 14:20:37.799510 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:37 crc kubenswrapper[4911]: I0310 14:20:37.986132 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" Mar 10 14:20:37 crc kubenswrapper[4911]: I0310 14:20:37.992116 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82c1a17-4dc8-48c2-9bc2-9d7168524de3-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h\" (UID: \"f82c1a17-4dc8-48c2-9bc2-9d7168524de3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" Mar 10 14:20:38 crc kubenswrapper[4911]: I0310 14:20:38.176174 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nhpnk" Mar 10 14:20:38 crc kubenswrapper[4911]: I0310 14:20:38.183564 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" Mar 10 14:20:38 crc kubenswrapper[4911]: I0310 14:20:38.596555 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:38 crc kubenswrapper[4911]: I0310 14:20:38.596624 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:38 crc kubenswrapper[4911]: I0310 14:20:38.599784 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-webhook-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:38 crc kubenswrapper[4911]: I0310 14:20:38.600490 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fcc8b66-2a29-45c8-a445-a14770e3f157-metrics-certs\") pod \"openstack-operator-controller-manager-774dfd9959-g5lwx\" (UID: \"0fcc8b66-2a29-45c8-a445-a14770e3f157\") " pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:38 crc kubenswrapper[4911]: I0310 14:20:38.845182 4911 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-fsg9z" Mar 10 14:20:38 crc kubenswrapper[4911]: I0310 14:20:38.853677 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:39 crc kubenswrapper[4911]: E0310 14:20:39.754803 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 10 14:20:39 crc kubenswrapper[4911]: E0310 14:20:39.755933 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nxgn2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7n8pz_openstack-operators(0b94f7c5-35a4-430f-bccb-011f386954d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:20:39 crc kubenswrapper[4911]: E0310 14:20:39.757342 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz" podUID="0b94f7c5-35a4-430f-bccb-011f386954d5" Mar 10 14:20:40 crc kubenswrapper[4911]: I0310 14:20:40.453520 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs"] Mar 10 14:20:40 crc kubenswrapper[4911]: W0310 14:20:40.493815 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5336054_5038_40f7_8512_9fe34269f6cd.slice/crio-5c592311c8a5650709d5d001154d86ae0f508ddaa59bd80ce2a454b43f83b5dc 
WatchSource:0}: Error finding container 5c592311c8a5650709d5d001154d86ae0f508ddaa59bd80ce2a454b43f83b5dc: Status 404 returned error can't find the container with id 5c592311c8a5650709d5d001154d86ae0f508ddaa59bd80ce2a454b43f83b5dc Mar 10 14:20:40 crc kubenswrapper[4911]: I0310 14:20:40.530072 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx"] Mar 10 14:20:40 crc kubenswrapper[4911]: I0310 14:20:40.602946 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h"] Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.220665 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d" event={"ID":"c8487a91-d6ca-480d-a451-35e6516bc9e8","Type":"ContainerStarted","Data":"af09d64f8203c4c6bf469f29a8eb864be7fce4ed152eab109e69d577bcdfe4a6"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.221715 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.230476 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k" event={"ID":"90f49412-d2c3-46ba-9591-5adee9624834","Type":"ContainerStarted","Data":"6622722413b7c2a1d55f4deccb55d96055f1a2303ccf583f358790e709eee6c6"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.230831 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.232557 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7" 
event={"ID":"0dabe548-2d6c-4bbb-8199-6403e57d2ac9","Type":"ContainerStarted","Data":"581b04557e9d39fd89a1c64f99bb998135f4f7fa40d96563a0450a9378a15cfe"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.233377 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.236341 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw" event={"ID":"c8490ff5-eaf3-4d9e-b9f9-7ad3ae159298","Type":"ContainerStarted","Data":"7234d8c7560def6d67b3c659b787b621fbfaf9ebdde6a8a86d26cd1645fc0a6c"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.237059 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.239744 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" event={"ID":"0fcc8b66-2a29-45c8-a445-a14770e3f157","Type":"ContainerStarted","Data":"4e0d23f149b652aee099f1ae9b142c0a57b8581ec795e5f92185f09299fcdc29"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.239862 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.255812 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv" event={"ID":"53d2a376-b957-4875-8bfe-42d5dbc0a634","Type":"ContainerStarted","Data":"4f35d9d5ca15e8cb4913bb9339c6d900ab0c6ee49bac4ce26b405f509f959d55"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.256408 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.286116 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" event={"ID":"14dd9547-ff92-4cb4-a055-e41fd390e90e","Type":"ContainerStarted","Data":"6fc2e469bf0ce1d8ae19e87cb1eea5f333d5c9d6a7913c1f8d5da9d1e25d8eaa"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.287302 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.316160 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" event={"ID":"6a0bd4c9-4420-48be-9637-67ea2b5c89d1","Type":"ContainerStarted","Data":"4959ebddb68d1b768c309f45e8e29ae04dbb18b1120102bd5e4a20f5e78cb223"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.317149 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.338113 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x" event={"ID":"fca0a377-f77c-4e24-aec1-8ffb8ba87963","Type":"ContainerStarted","Data":"027569ebd8026338054edf64398002080cd9ea619d5a2a8e3ddfb0abe8c44498"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.338648 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.361965 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw" 
event={"ID":"a07393e2-b210-4e68-8cd3-62a838f86071","Type":"ContainerStarted","Data":"1ae08a3ac52e79a4305170a20b025e84a003c3985e49166cabfb85c2138916fe"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.362619 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.379560 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq" event={"ID":"7c70b1a5-051f-43b5-80a3-1b462b9a50f8","Type":"ContainerStarted","Data":"f589f98ca05919b5c861c263c8e0f118de99681910f5d9e7a8ae965aa2b8ef94"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.380355 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.396942 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" event={"ID":"a937a94a-14cb-4319-9147-d0ac60c5cc6a","Type":"ContainerStarted","Data":"21c74ce9745565834077e13e6c45fe0f61ac3dab0d02e0391f3217c490c9bb54"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.397970 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.416895 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr" event={"ID":"ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20","Type":"ContainerStarted","Data":"c5ef24b3d54efcf5620f6a1c2e8d7cc027a47b5f1a0e27d2c3d69807b5da9cb9"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.417649 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.424377 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" podStartSLOduration=4.180349742 podStartE2EDuration="36.424357708s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:08.073564348 +0000 UTC m=+1112.637084255" lastFinishedPulling="2026-03-10 14:20:40.317572304 +0000 UTC m=+1144.881092221" observedRunningTime="2026-03-10 14:20:41.423175057 +0000 UTC m=+1145.986694964" watchObservedRunningTime="2026-03-10 14:20:41.424357708 +0000 UTC m=+1145.987877625" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.440110 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl" event={"ID":"e2b89cc3-8229-4401-b8e9-9a32bffb0f57","Type":"ContainerStarted","Data":"693044e5b2b40506d0812cea73d1cf16e617655ec6c7745352f48e9cce6be1c3"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.440625 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.441215 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" event={"ID":"c5336054-5038-40f7-8512-9fe34269f6cd","Type":"ContainerStarted","Data":"5c592311c8a5650709d5d001154d86ae0f508ddaa59bd80ce2a454b43f83b5dc"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.443461 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc" event={"ID":"902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6","Type":"ContainerStarted","Data":"318fc29d5ced972828fff4b823fab72fc834d494144318f9b5ae6819fe58e0b3"} Mar 10 
14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.443914 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.446026 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" event={"ID":"998d9bc8-11c1-4967-b3d9-1c823d6c41d6","Type":"ContainerStarted","Data":"ecbe50b5ef62bb7c2ef9db1d0fd626755e28b84c0d61b682642c7378cb52ff86"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.446931 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.470619 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" event={"ID":"f82c1a17-4dc8-48c2-9bc2-9d7168524de3","Type":"ContainerStarted","Data":"f33c4fb57e6c2a2cca9ce6751556c92aedf7c3876851238eab61bb18dbfdd799"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.526697 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch" event={"ID":"5cf94e3e-325d-4364-bf70-c479683b2be6","Type":"ContainerStarted","Data":"54028c48b84f29978a07ba92d1891e1f43fc160c9a969afa42fff4e51ebf5fcd"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.527462 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.555173 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2" 
event={"ID":"a036efe0-e6cc-4ebe-8b06-70bc180b7b1c","Type":"ContainerStarted","Data":"8c2f279cb70b42329fd563e9df1bdd33b3b5ff24003ac3e7454f9c3114147857"} Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.556289 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.564085 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7" podStartSLOduration=13.45398204 podStartE2EDuration="36.564053174s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.33002618 +0000 UTC m=+1111.893546097" lastFinishedPulling="2026-03-10 14:20:30.440097314 +0000 UTC m=+1135.003617231" observedRunningTime="2026-03-10 14:20:41.555847517 +0000 UTC m=+1146.119367434" watchObservedRunningTime="2026-03-10 14:20:41.564053174 +0000 UTC m=+1146.127573091" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.805275 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw" podStartSLOduration=13.175408572 podStartE2EDuration="36.805249523s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:06.810247632 +0000 UTC m=+1111.373767549" lastFinishedPulling="2026-03-10 14:20:30.440088543 +0000 UTC m=+1135.003608500" observedRunningTime="2026-03-10 14:20:41.701933681 +0000 UTC m=+1146.265453598" watchObservedRunningTime="2026-03-10 14:20:41.805249523 +0000 UTC m=+1146.368769430" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.914189 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x" podStartSLOduration=14.458841261 podStartE2EDuration="36.914169525s" podCreationTimestamp="2026-03-10 
14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.98479761 +0000 UTC m=+1112.548317527" lastFinishedPulling="2026-03-10 14:20:30.440125864 +0000 UTC m=+1135.003645791" observedRunningTime="2026-03-10 14:20:41.911158375 +0000 UTC m=+1146.474678292" watchObservedRunningTime="2026-03-10 14:20:41.914169525 +0000 UTC m=+1146.477689442" Mar 10 14:20:41 crc kubenswrapper[4911]: I0310 14:20:41.915569 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" podStartSLOduration=4.472592443 podStartE2EDuration="36.915563891s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.633016445 +0000 UTC m=+1112.196536362" lastFinishedPulling="2026-03-10 14:20:40.075987893 +0000 UTC m=+1144.639507810" observedRunningTime="2026-03-10 14:20:41.802127551 +0000 UTC m=+1146.365647458" watchObservedRunningTime="2026-03-10 14:20:41.915563891 +0000 UTC m=+1146.479083808" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.131473 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" podStartSLOduration=36.131446772 podStartE2EDuration="36.131446772s" podCreationTimestamp="2026-03-10 14:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:20:42.126492651 +0000 UTC m=+1146.690012568" watchObservedRunningTime="2026-03-10 14:20:42.131446772 +0000 UTC m=+1146.694966689" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.199444 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d" podStartSLOduration=5.521735704 podStartE2EDuration="37.19942041s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:08.131030068 
+0000 UTC m=+1112.694549985" lastFinishedPulling="2026-03-10 14:20:39.808714774 +0000 UTC m=+1144.372234691" observedRunningTime="2026-03-10 14:20:42.188297746 +0000 UTC m=+1146.751817663" watchObservedRunningTime="2026-03-10 14:20:42.19942041 +0000 UTC m=+1146.762940327" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.296696 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k" podStartSLOduration=14.542324719 podStartE2EDuration="37.296672202s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.685667978 +0000 UTC m=+1112.249187895" lastFinishedPulling="2026-03-10 14:20:30.440015451 +0000 UTC m=+1135.003535378" observedRunningTime="2026-03-10 14:20:42.294954297 +0000 UTC m=+1146.858474214" watchObservedRunningTime="2026-03-10 14:20:42.296672202 +0000 UTC m=+1146.860192119" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.339647 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv" podStartSLOduration=14.212665998 podStartE2EDuration="37.339621028s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.311289605 +0000 UTC m=+1111.874809522" lastFinishedPulling="2026-03-10 14:20:30.438244595 +0000 UTC m=+1135.001764552" observedRunningTime="2026-03-10 14:20:42.337294977 +0000 UTC m=+1146.900814904" watchObservedRunningTime="2026-03-10 14:20:42.339621028 +0000 UTC m=+1146.903140945" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.404414 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl" podStartSLOduration=13.907310901 podStartE2EDuration="37.404384731s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:06.944168784 +0000 UTC 
m=+1111.507688701" lastFinishedPulling="2026-03-10 14:20:30.441242614 +0000 UTC m=+1135.004762531" observedRunningTime="2026-03-10 14:20:42.395940818 +0000 UTC m=+1146.959460735" watchObservedRunningTime="2026-03-10 14:20:42.404384731 +0000 UTC m=+1146.967904648" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.438001 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq" podStartSLOduration=5.3578762 podStartE2EDuration="37.43796333s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.950939324 +0000 UTC m=+1112.514459241" lastFinishedPulling="2026-03-10 14:20:40.031026454 +0000 UTC m=+1144.594546371" observedRunningTime="2026-03-10 14:20:42.421244297 +0000 UTC m=+1146.984764214" watchObservedRunningTime="2026-03-10 14:20:42.43796333 +0000 UTC m=+1147.001483247" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.556709 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch" podStartSLOduration=15.0913274 podStartE2EDuration="37.55668537s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.975001891 +0000 UTC m=+1112.538521808" lastFinishedPulling="2026-03-10 14:20:30.440359831 +0000 UTC m=+1135.003879778" observedRunningTime="2026-03-10 14:20:42.555363785 +0000 UTC m=+1147.118883702" watchObservedRunningTime="2026-03-10 14:20:42.55668537 +0000 UTC m=+1147.120205277" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.557208 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" podStartSLOduration=5.211451027 podStartE2EDuration="37.557202894s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.685686068 +0000 UTC m=+1112.249205985" 
lastFinishedPulling="2026-03-10 14:20:40.031437915 +0000 UTC m=+1144.594957852" observedRunningTime="2026-03-10 14:20:42.526630265 +0000 UTC m=+1147.090150182" watchObservedRunningTime="2026-03-10 14:20:42.557202894 +0000 UTC m=+1147.120722811" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.580828 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr" podStartSLOduration=4.9766246370000005 podStartE2EDuration="36.580795058s" podCreationTimestamp="2026-03-10 14:20:06 +0000 UTC" firstStartedPulling="2026-03-10 14:20:08.194280971 +0000 UTC m=+1112.757800888" lastFinishedPulling="2026-03-10 14:20:39.798451392 +0000 UTC m=+1144.361971309" observedRunningTime="2026-03-10 14:20:42.577570973 +0000 UTC m=+1147.141090890" watchObservedRunningTime="2026-03-10 14:20:42.580795058 +0000 UTC m=+1147.144314975" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.587036 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" event={"ID":"c5080d31-7711-4e4a-9902-4843929a16e9","Type":"ContainerStarted","Data":"47cc3052d17475b7c2d894212bc5a82c77441d4018d6876dc022b8cb38763164"} Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.587835 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.603389 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" event={"ID":"0fcc8b66-2a29-45c8-a445-a14770e3f157","Type":"ContainerStarted","Data":"5ebce6ec29a8ed1972059a24c3ed166e397a2884fe26ce3ff3000844acfb599f"} Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.625419 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw" podStartSLOduration=15.030028129 podStartE2EDuration="37.625398278s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.844836728 +0000 UTC m=+1112.408356645" lastFinishedPulling="2026-03-10 14:20:30.440206877 +0000 UTC m=+1135.003726794" observedRunningTime="2026-03-10 14:20:42.623011675 +0000 UTC m=+1147.186531602" watchObservedRunningTime="2026-03-10 14:20:42.625398278 +0000 UTC m=+1147.188918195" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.698539 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" podStartSLOduration=4.831491357 podStartE2EDuration="37.698518612s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.164355578 +0000 UTC m=+1111.727875515" lastFinishedPulling="2026-03-10 14:20:40.031382853 +0000 UTC m=+1144.594902770" observedRunningTime="2026-03-10 14:20:42.662050677 +0000 UTC m=+1147.225570584" watchObservedRunningTime="2026-03-10 14:20:42.698518612 +0000 UTC m=+1147.262038519" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.700290 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2" podStartSLOduration=6.031985411 podStartE2EDuration="37.700282478s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:08.130798532 +0000 UTC m=+1112.694318449" lastFinishedPulling="2026-03-10 14:20:39.799095559 +0000 UTC m=+1144.362615516" observedRunningTime="2026-03-10 14:20:42.696292463 +0000 UTC m=+1147.259812380" watchObservedRunningTime="2026-03-10 14:20:42.700282478 +0000 UTC m=+1147.263802395" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.727513 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc" podStartSLOduration=5.734890192 podStartE2EDuration="37.727494398s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:08.098457266 +0000 UTC m=+1112.661977183" lastFinishedPulling="2026-03-10 14:20:40.091061472 +0000 UTC m=+1144.654581389" observedRunningTime="2026-03-10 14:20:42.722874476 +0000 UTC m=+1147.286394393" watchObservedRunningTime="2026-03-10 14:20:42.727494398 +0000 UTC m=+1147.291014315" Mar 10 14:20:42 crc kubenswrapper[4911]: I0310 14:20:42.766404 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" podStartSLOduration=3.726953932 podStartE2EDuration="37.766381607s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.840932995 +0000 UTC m=+1112.404452912" lastFinishedPulling="2026-03-10 14:20:41.88036068 +0000 UTC m=+1146.443880587" observedRunningTime="2026-03-10 14:20:42.762054652 +0000 UTC m=+1147.325574569" watchObservedRunningTime="2026-03-10 14:20:42.766381607 +0000 UTC m=+1147.329901544" Mar 10 14:20:43 crc kubenswrapper[4911]: I0310 14:20:43.614745 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" event={"ID":"06a5238b-e7e1-49a5-9bb8-5f6162183a13","Type":"ContainerStarted","Data":"8b8d017fe8b9b47ef9f177c1ccd51e107c27dabec255b4635180c671b9ea4b19"} Mar 10 14:20:43 crc kubenswrapper[4911]: I0310 14:20:43.616502 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" Mar 10 14:20:43 crc kubenswrapper[4911]: I0310 14:20:43.641265 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" podStartSLOduration=3.419113978 
podStartE2EDuration="38.641238678s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:07.526538608 +0000 UTC m=+1112.090058525" lastFinishedPulling="2026-03-10 14:20:42.748663308 +0000 UTC m=+1147.312183225" observedRunningTime="2026-03-10 14:20:43.638486855 +0000 UTC m=+1148.202006772" watchObservedRunningTime="2026-03-10 14:20:43.641238678 +0000 UTC m=+1148.204758595" Mar 10 14:20:45 crc kubenswrapper[4911]: I0310 14:20:45.844210 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-wv8p7" Mar 10 14:20:45 crc kubenswrapper[4911]: I0310 14:20:45.862233 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hngxq" Mar 10 14:20:45 crc kubenswrapper[4911]: I0310 14:20:45.875149 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-t2lgw" Mar 10 14:20:45 crc kubenswrapper[4911]: I0310 14:20:45.967176 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-72kfl" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.047919 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xw7bn" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.098891 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8mq9k" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.206245 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8mqrm" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.206298 4911 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7pcfv" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.237566 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-z5pz7" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.274549 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5v9fq" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.309230 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cqdch" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.441378 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-2mnh2" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.534431 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-khnvw" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.543491 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-8jk8x" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.668044 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" event={"ID":"c5336054-5038-40f7-8512-9fe34269f6cd","Type":"ContainerStarted","Data":"1f36b4265d1ced4f28887809a57d967a0d12f0cf6975c9586442a9eba660fdbf"} Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.669179 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 
14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.670180 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" event={"ID":"f82c1a17-4dc8-48c2-9bc2-9d7168524de3","Type":"ContainerStarted","Data":"37c94312b7a839a9b3ac255e6d7553211b15f86ea628b0016f5fcb321e084453"} Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.671075 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.690336 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" podStartSLOduration=35.984517715 podStartE2EDuration="41.690311219s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:40.498182321 +0000 UTC m=+1145.061702238" lastFinishedPulling="2026-03-10 14:20:46.203975825 +0000 UTC m=+1150.767495742" observedRunningTime="2026-03-10 14:20:46.688953163 +0000 UTC m=+1151.252473080" watchObservedRunningTime="2026-03-10 14:20:46.690311219 +0000 UTC m=+1151.253831136" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.718162 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" podStartSLOduration=36.189410145 podStartE2EDuration="41.718125365s" podCreationTimestamp="2026-03-10 14:20:05 +0000 UTC" firstStartedPulling="2026-03-10 14:20:40.676170749 +0000 UTC m=+1145.239690666" lastFinishedPulling="2026-03-10 14:20:46.204885969 +0000 UTC m=+1150.768405886" observedRunningTime="2026-03-10 14:20:46.716928423 +0000 UTC m=+1151.280448350" watchObservedRunningTime="2026-03-10 14:20:46.718125365 +0000 UTC m=+1151.281645282" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.769337 4911 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-pgf4d" Mar 10 14:20:46 crc kubenswrapper[4911]: I0310 14:20:46.850268 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wltgc" Mar 10 14:20:47 crc kubenswrapper[4911]: I0310 14:20:47.032316 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fmrnr" Mar 10 14:20:48 crc kubenswrapper[4911]: I0310 14:20:48.520949 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:20:48 crc kubenswrapper[4911]: I0310 14:20:48.521600 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:20:48 crc kubenswrapper[4911]: I0310 14:20:48.863006 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-774dfd9959-g5lwx" Mar 10 14:20:52 crc kubenswrapper[4911]: E0310 14:20:52.196883 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz" 
podUID="0b94f7c5-35a4-430f-bccb-011f386954d5" Mar 10 14:20:56 crc kubenswrapper[4911]: I0310 14:20:56.026106 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-t2qf9" Mar 10 14:20:56 crc kubenswrapper[4911]: I0310 14:20:56.332666 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-57mls" Mar 10 14:20:57 crc kubenswrapper[4911]: I0310 14:20:57.812138 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-jjsfs" Mar 10 14:20:58 crc kubenswrapper[4911]: I0310 14:20:58.189539 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h" Mar 10 14:21:06 crc kubenswrapper[4911]: I0310 14:21:06.837075 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz" event={"ID":"0b94f7c5-35a4-430f-bccb-011f386954d5","Type":"ContainerStarted","Data":"8c105df65d9e6304d702eea51a2679c49814724d2d99ccc6f53c97c5d64b96c7"} Mar 10 14:21:06 crc kubenswrapper[4911]: I0310 14:21:06.853765 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7n8pz" podStartSLOduration=3.41414098 podStartE2EDuration="1m0.853738822s" podCreationTimestamp="2026-03-10 14:20:06 +0000 UTC" firstStartedPulling="2026-03-10 14:20:08.174440516 +0000 UTC m=+1112.737960423" lastFinishedPulling="2026-03-10 14:21:05.614038348 +0000 UTC m=+1170.177558265" observedRunningTime="2026-03-10 14:21:06.850867741 +0000 UTC m=+1171.414387658" watchObservedRunningTime="2026-03-10 14:21:06.853738822 +0000 UTC m=+1171.417258739" Mar 10 14:21:18 crc kubenswrapper[4911]: I0310 14:21:18.521130 4911 
patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:21:18 crc kubenswrapper[4911]: I0310 14:21:18.522558 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.840660 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fp6h2"] Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.844565 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.848445 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5fjbb" Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.848661 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.848816 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.848548 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.862742 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fp6h2"] Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.901087 4911 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-2hzb7"] Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.902932 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.906270 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 14:21:26 crc kubenswrapper[4911]: I0310 14:21:26.921709 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2hzb7"] Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.042774 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpssc\" (UniqueName: \"kubernetes.io/projected/b7f7e715-7aa2-4903-b925-3763fee5918d-kube-api-access-hpssc\") pod \"dnsmasq-dns-78dd6ddcc-2hzb7\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.043524 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2hzb7\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.044287 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13fa709c-20ac-4e81-ad9e-6755042b11ef-config\") pod \"dnsmasq-dns-675f4bcbfc-fp6h2\" (UID: \"13fa709c-20ac-4e81-ad9e-6755042b11ef\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.044456 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-config\") pod \"dnsmasq-dns-78dd6ddcc-2hzb7\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.044632 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xpsp\" (UniqueName: \"kubernetes.io/projected/13fa709c-20ac-4e81-ad9e-6755042b11ef-kube-api-access-4xpsp\") pod \"dnsmasq-dns-675f4bcbfc-fp6h2\" (UID: \"13fa709c-20ac-4e81-ad9e-6755042b11ef\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.146773 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xpsp\" (UniqueName: \"kubernetes.io/projected/13fa709c-20ac-4e81-ad9e-6755042b11ef-kube-api-access-4xpsp\") pod \"dnsmasq-dns-675f4bcbfc-fp6h2\" (UID: \"13fa709c-20ac-4e81-ad9e-6755042b11ef\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.146870 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpssc\" (UniqueName: \"kubernetes.io/projected/b7f7e715-7aa2-4903-b925-3763fee5918d-kube-api-access-hpssc\") pod \"dnsmasq-dns-78dd6ddcc-2hzb7\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.146930 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2hzb7\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.146979 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/13fa709c-20ac-4e81-ad9e-6755042b11ef-config\") pod \"dnsmasq-dns-675f4bcbfc-fp6h2\" (UID: \"13fa709c-20ac-4e81-ad9e-6755042b11ef\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.147014 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-config\") pod \"dnsmasq-dns-78dd6ddcc-2hzb7\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.148242 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2hzb7\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.148686 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-config\") pod \"dnsmasq-dns-78dd6ddcc-2hzb7\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.148851 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13fa709c-20ac-4e81-ad9e-6755042b11ef-config\") pod \"dnsmasq-dns-675f4bcbfc-fp6h2\" (UID: \"13fa709c-20ac-4e81-ad9e-6755042b11ef\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.170259 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xpsp\" (UniqueName: \"kubernetes.io/projected/13fa709c-20ac-4e81-ad9e-6755042b11ef-kube-api-access-4xpsp\") pod \"dnsmasq-dns-675f4bcbfc-fp6h2\" (UID: 
\"13fa709c-20ac-4e81-ad9e-6755042b11ef\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2"
Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.170761 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpssc\" (UniqueName: \"kubernetes.io/projected/b7f7e715-7aa2-4903-b925-3763fee5918d-kube-api-access-hpssc\") pod \"dnsmasq-dns-78dd6ddcc-2hzb7\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7"
Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.227497 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7"
Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.467749 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2"
Mar 10 14:21:27 crc kubenswrapper[4911]: I0310 14:21:27.500324 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2hzb7"]
Mar 10 14:21:28 crc kubenswrapper[4911]: I0310 14:21:28.204521 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" event={"ID":"b7f7e715-7aa2-4903-b925-3763fee5918d","Type":"ContainerStarted","Data":"fb0b5c3cc30b455b99d2510c4b52eec20ed55a04e539f7259f23f58169cdb142"}
Mar 10 14:21:28 crc kubenswrapper[4911]: I0310 14:21:28.205317 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fp6h2"]
Mar 10 14:21:28 crc kubenswrapper[4911]: W0310 14:21:28.210077 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13fa709c_20ac_4e81_ad9e_6755042b11ef.slice/crio-1002101d2b25278372b834a865f6d02f5af8388bcb9b3134a23beddc9ff0c8f8 WatchSource:0}: Error finding container 1002101d2b25278372b834a865f6d02f5af8388bcb9b3134a23beddc9ff0c8f8: Status 404 returned error can't find the container with id 1002101d2b25278372b834a865f6d02f5af8388bcb9b3134a23beddc9ff0c8f8
Mar 10 14:21:29 crc kubenswrapper[4911]: I0310 14:21:29.217922 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" event={"ID":"13fa709c-20ac-4e81-ad9e-6755042b11ef","Type":"ContainerStarted","Data":"1002101d2b25278372b834a865f6d02f5af8388bcb9b3134a23beddc9ff0c8f8"}
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.134346 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fp6h2"]
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.212807 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5xdw"]
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.213993 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.227751 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5xdw"]
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.317554 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-config\") pod \"dnsmasq-dns-5ccc8479f9-h5xdw\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.317648 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-h5xdw\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.317948 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv4mt\" (UniqueName: \"kubernetes.io/projected/d536338c-ff72-4f8c-99c6-59b3feba5cf8-kube-api-access-gv4mt\") pod \"dnsmasq-dns-5ccc8479f9-h5xdw\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.432244 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-config\") pod \"dnsmasq-dns-5ccc8479f9-h5xdw\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.432293 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-h5xdw\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.432364 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv4mt\" (UniqueName: \"kubernetes.io/projected/d536338c-ff72-4f8c-99c6-59b3feba5cf8-kube-api-access-gv4mt\") pod \"dnsmasq-dns-5ccc8479f9-h5xdw\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.433535 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-config\") pod \"dnsmasq-dns-5ccc8479f9-h5xdw\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.433535 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-h5xdw\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.494986 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv4mt\" (UniqueName: \"kubernetes.io/projected/d536338c-ff72-4f8c-99c6-59b3feba5cf8-kube-api-access-gv4mt\") pod \"dnsmasq-dns-5ccc8479f9-h5xdw\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.540047 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.562132 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2hzb7"]
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.585397 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ffg46"]
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.588898 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.610737 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ffg46"]
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.635112 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-config\") pod \"dnsmasq-dns-57d769cc4f-ffg46\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.635257 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ffg46\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.635300 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lmfp\" (UniqueName: \"kubernetes.io/projected/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-kube-api-access-9lmfp\") pod \"dnsmasq-dns-57d769cc4f-ffg46\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.737678 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ffg46\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.737757 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lmfp\" (UniqueName: \"kubernetes.io/projected/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-kube-api-access-9lmfp\") pod \"dnsmasq-dns-57d769cc4f-ffg46\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.737809 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-config\") pod \"dnsmasq-dns-57d769cc4f-ffg46\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.738898 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-config\") pod \"dnsmasq-dns-57d769cc4f-ffg46\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.738951 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ffg46\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.777744 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lmfp\" (UniqueName: \"kubernetes.io/projected/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-kube-api-access-9lmfp\") pod \"dnsmasq-dns-57d769cc4f-ffg46\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:30 crc kubenswrapper[4911]: I0310 14:21:30.965784 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ffg46"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.285438 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5xdw"]
Mar 10 14:21:31 crc kubenswrapper[4911]: W0310 14:21:31.321272 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd536338c_ff72_4f8c_99c6_59b3feba5cf8.slice/crio-ad8398624c564a2eea18748fa7ddcb741e8302f723392ea315383b43ea6986e3 WatchSource:0}: Error finding container ad8398624c564a2eea18748fa7ddcb741e8302f723392ea315383b43ea6986e3: Status 404 returned error can't find the container with id ad8398624c564a2eea18748fa7ddcb741e8302f723392ea315383b43ea6986e3
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.391391 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.399271 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.401191 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.401560 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.402019 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.402281 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.402419 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.402833 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.402947 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.403138 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s2nqt"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.554403 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ffg46"]
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561094 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561154 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561323 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561448 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561660 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561696 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kr66\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-kube-api-access-2kr66\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561766 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561811 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561861 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561882 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.561970 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.663870 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.663968 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.663995 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kr66\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-kube-api-access-2kr66\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.664031 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.664058 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.664084 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.664101 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.664123 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.664144 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.664165 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.664187 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.664746 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.668846 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.668957 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.670238 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.670292 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.670909 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.673874 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.676827 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.677553 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.678299 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.689768 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kr66\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-kube-api-access-2kr66\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.692151 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.734946 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.752916 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.754603 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.757365 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.757775 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.762473 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-q998f"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.762625 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.764466 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.765929 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.770026 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.802800 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.866707 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.866795 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.866851 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.866918 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.866937 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.866987 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.867079 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgpfh\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-kube-api-access-jgpfh\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.867101 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-pod-info\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.867200 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-server-conf\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.867246 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-config-data\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.867327 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.968836 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-pod-info\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.968926 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-server-conf\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.968947 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-config-data\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.968979 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.969009 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.969028 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.969048 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.969071 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.969087 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.969111 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.969134 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgpfh\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-kube-api-access-jgpfh\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.975042 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.976337 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.976967 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.978468 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-config-data\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.979036 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.979952 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:31 crc kubenswrapper[4911]: I0310 14:21:31.982164 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-server-conf\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.000000 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.015587 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.016144 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.024794 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgpfh\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-kube-api-access-jgpfh\") pod \"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0"
Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.033462 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-pod-info\") pod
\"rabbitmq-server-0\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " pod="openstack/rabbitmq-server-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.139116 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.308540 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ffg46" event={"ID":"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa","Type":"ContainerStarted","Data":"d5e5d356eff1f14a5218d82602074c306de5d67b6541908ce44c1cf19cd8f5e9"} Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.310529 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" event={"ID":"d536338c-ff72-4f8c-99c6-59b3feba5cf8","Type":"ContainerStarted","Data":"ad8398624c564a2eea18748fa7ddcb741e8302f723392ea315383b43ea6986e3"} Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.387888 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.391558 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.396046 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.401857 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.401907 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.402631 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gjv8z" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.402962 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.410245 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.413948 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 14:21:32 crc kubenswrapper[4911]: W0310 14:21:32.421473 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d1a8c73_283d_431f_bfd3_af06ca3c60ff.slice/crio-7df7970bd92215495a3ffe6976980013f2c6be864288ef8501b7654a7af92992 WatchSource:0}: Error finding container 7df7970bd92215495a3ffe6976980013f2c6be864288ef8501b7654a7af92992: Status 404 returned error can't find the container with id 7df7970bd92215495a3ffe6976980013f2c6be864288ef8501b7654a7af92992 Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.480469 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/5ff8ebc9-ea10-4e9c-be23-96608817ed84-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.480539 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46gsd\" (UniqueName: \"kubernetes.io/projected/5ff8ebc9-ea10-4e9c-be23-96608817ed84-kube-api-access-46gsd\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.480692 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5ff8ebc9-ea10-4e9c-be23-96608817ed84-config-data-default\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.480801 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff8ebc9-ea10-4e9c-be23-96608817ed84-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.480839 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff8ebc9-ea10-4e9c-be23-96608817ed84-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.480896 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.480928 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5ff8ebc9-ea10-4e9c-be23-96608817ed84-kolla-config\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.480963 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ff8ebc9-ea10-4e9c-be23-96608817ed84-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.583600 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.583675 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5ff8ebc9-ea10-4e9c-be23-96608817ed84-kolla-config\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.583809 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ff8ebc9-ea10-4e9c-be23-96608817ed84-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " 
pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.583864 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5ff8ebc9-ea10-4e9c-be23-96608817ed84-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.583890 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46gsd\" (UniqueName: \"kubernetes.io/projected/5ff8ebc9-ea10-4e9c-be23-96608817ed84-kube-api-access-46gsd\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.583916 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5ff8ebc9-ea10-4e9c-be23-96608817ed84-config-data-default\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.583948 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff8ebc9-ea10-4e9c-be23-96608817ed84-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.583981 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff8ebc9-ea10-4e9c-be23-96608817ed84-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.585830 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5ff8ebc9-ea10-4e9c-be23-96608817ed84-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.585999 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.587457 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5ff8ebc9-ea10-4e9c-be23-96608817ed84-config-data-default\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.588154 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5ff8ebc9-ea10-4e9c-be23-96608817ed84-kolla-config\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.589147 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ff8ebc9-ea10-4e9c-be23-96608817ed84-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.599080 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5ff8ebc9-ea10-4e9c-be23-96608817ed84-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.611084 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff8ebc9-ea10-4e9c-be23-96608817ed84-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.616405 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.619704 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46gsd\" (UniqueName: \"kubernetes.io/projected/5ff8ebc9-ea10-4e9c-be23-96608817ed84-kube-api-access-46gsd\") pod \"openstack-galera-0\" (UID: \"5ff8ebc9-ea10-4e9c-be23-96608817ed84\") " pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.715416 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 14:21:32 crc kubenswrapper[4911]: I0310 14:21:32.807579 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 14:21:33 crc kubenswrapper[4911]: I0310 14:21:33.371089 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d1a8c73-283d-431f-bfd3-af06ca3c60ff","Type":"ContainerStarted","Data":"7df7970bd92215495a3ffe6976980013f2c6be864288ef8501b7654a7af92992"} Mar 10 14:21:33 crc kubenswrapper[4911]: I0310 14:21:33.915515 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 14:21:33 crc kubenswrapper[4911]: I0310 14:21:33.917243 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:33 crc kubenswrapper[4911]: I0310 14:21:33.920741 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 14:21:33 crc kubenswrapper[4911]: I0310 14:21:33.922654 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 14:21:33 crc kubenswrapper[4911]: I0310 14:21:33.922853 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-k5zp4" Mar 10 14:21:33 crc kubenswrapper[4911]: I0310 14:21:33.922973 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 14:21:33 crc kubenswrapper[4911]: I0310 14:21:33.940862 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.017686 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.017759 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e7efec5-8494-472d-b149-a6aeed4810b2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.017799 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6tm\" (UniqueName: \"kubernetes.io/projected/6e7efec5-8494-472d-b149-a6aeed4810b2-kube-api-access-9n6tm\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.017845 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e7efec5-8494-472d-b149-a6aeed4810b2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.018080 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e7efec5-8494-472d-b149-a6aeed4810b2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.018199 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6e7efec5-8494-472d-b149-a6aeed4810b2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.018326 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e7efec5-8494-472d-b149-a6aeed4810b2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.018396 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e7efec5-8494-472d-b149-a6aeed4810b2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.085551 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.086832 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.093559 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-kddgk" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.093764 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.093998 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.100767 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.123844 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e7efec5-8494-472d-b149-a6aeed4810b2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.123911 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e7efec5-8494-472d-b149-a6aeed4810b2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.123977 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5789v\" (UniqueName: \"kubernetes.io/projected/45599797-9a4e-428b-8f95-39b6db7bd84e-kube-api-access-5789v\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.124332 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7efec5-8494-472d-b149-a6aeed4810b2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.124377 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45599797-9a4e-428b-8f95-39b6db7bd84e-kolla-config\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.124424 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/45599797-9a4e-428b-8f95-39b6db7bd84e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.124456 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e7efec5-8494-472d-b149-a6aeed4810b2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.124478 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45599797-9a4e-428b-8f95-39b6db7bd84e-config-data\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.124512 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/6e7efec5-8494-472d-b149-a6aeed4810b2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.124598 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.124636 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45599797-9a4e-428b-8f95-39b6db7bd84e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.124756 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e7efec5-8494-472d-b149-a6aeed4810b2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.124871 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6tm\" (UniqueName: \"kubernetes.io/projected/6e7efec5-8494-472d-b149-a6aeed4810b2-kube-api-access-9n6tm\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.130480 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"6e7efec5-8494-472d-b149-a6aeed4810b2\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.132782 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e7efec5-8494-472d-b149-a6aeed4810b2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.134752 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e7efec5-8494-472d-b149-a6aeed4810b2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.137660 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7efec5-8494-472d-b149-a6aeed4810b2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.138260 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e7efec5-8494-472d-b149-a6aeed4810b2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.138961 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e7efec5-8494-472d-b149-a6aeed4810b2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " 
pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.146203 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6tm\" (UniqueName: \"kubernetes.io/projected/6e7efec5-8494-472d-b149-a6aeed4810b2-kube-api-access-9n6tm\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.151032 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e7efec5-8494-472d-b149-a6aeed4810b2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.153170 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6e7efec5-8494-472d-b149-a6aeed4810b2\") " pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.227533 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5789v\" (UniqueName: \"kubernetes.io/projected/45599797-9a4e-428b-8f95-39b6db7bd84e-kube-api-access-5789v\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.227643 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45599797-9a4e-428b-8f95-39b6db7bd84e-kolla-config\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.229619 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/45599797-9a4e-428b-8f95-39b6db7bd84e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.229834 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45599797-9a4e-428b-8f95-39b6db7bd84e-kolla-config\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.230255 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45599797-9a4e-428b-8f95-39b6db7bd84e-config-data\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.231116 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45599797-9a4e-428b-8f95-39b6db7bd84e-config-data\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.231391 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45599797-9a4e-428b-8f95-39b6db7bd84e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.234093 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45599797-9a4e-428b-8f95-39b6db7bd84e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " 
pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.248841 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5789v\" (UniqueName: \"kubernetes.io/projected/45599797-9a4e-428b-8f95-39b6db7bd84e-kube-api-access-5789v\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.249396 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.251176 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/45599797-9a4e-428b-8f95-39b6db7bd84e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"45599797-9a4e-428b-8f95-39b6db7bd84e\") " pod="openstack/memcached-0" Mar 10 14:21:34 crc kubenswrapper[4911]: I0310 14:21:34.492158 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 14:21:36 crc kubenswrapper[4911]: I0310 14:21:36.299944 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 14:21:36 crc kubenswrapper[4911]: I0310 14:21:36.301440 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 14:21:36 crc kubenswrapper[4911]: I0310 14:21:36.305483 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-b4tk5" Mar 10 14:21:36 crc kubenswrapper[4911]: I0310 14:21:36.316041 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 14:21:36 crc kubenswrapper[4911]: I0310 14:21:36.372651 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkvsw\" (UniqueName: \"kubernetes.io/projected/88c561c2-ba90-4331-8dc1-3098939b3b3c-kube-api-access-kkvsw\") pod \"kube-state-metrics-0\" (UID: \"88c561c2-ba90-4331-8dc1-3098939b3b3c\") " pod="openstack/kube-state-metrics-0" Mar 10 14:21:36 crc kubenswrapper[4911]: I0310 14:21:36.475535 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkvsw\" (UniqueName: \"kubernetes.io/projected/88c561c2-ba90-4331-8dc1-3098939b3b3c-kube-api-access-kkvsw\") pod \"kube-state-metrics-0\" (UID: \"88c561c2-ba90-4331-8dc1-3098939b3b3c\") " pod="openstack/kube-state-metrics-0" Mar 10 14:21:36 crc kubenswrapper[4911]: I0310 14:21:36.527792 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkvsw\" (UniqueName: \"kubernetes.io/projected/88c561c2-ba90-4331-8dc1-3098939b3b3c-kube-api-access-kkvsw\") pod \"kube-state-metrics-0\" (UID: \"88c561c2-ba90-4331-8dc1-3098939b3b3c\") " pod="openstack/kube-state-metrics-0" Mar 10 14:21:36 crc kubenswrapper[4911]: I0310 14:21:36.629051 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.311908 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9vssn"] Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.314131 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.316407 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.316671 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-fxw9k" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.320575 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.327621 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9vssn"] Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.402418 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5hbsq"] Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.405778 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.416138 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5hbsq"] Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.440365 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4pg5\" (UniqueName: \"kubernetes.io/projected/e43fdd12-0361-428c-8318-d1cec1c95399-kube-api-access-b4pg5\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.440416 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43fdd12-0361-428c-8318-d1cec1c95399-combined-ca-bundle\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.440456 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e43fdd12-0361-428c-8318-d1cec1c95399-var-log-ovn\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.440488 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e43fdd12-0361-428c-8318-d1cec1c95399-var-run\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.442998 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/e43fdd12-0361-428c-8318-d1cec1c95399-var-run-ovn\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.443069 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e43fdd12-0361-428c-8318-d1cec1c95399-scripts\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.443158 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e43fdd12-0361-428c-8318-d1cec1c95399-ovn-controller-tls-certs\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.455145 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745","Type":"ContainerStarted","Data":"2ff5d36b7e7aa0284f0d4da1fca571dccca2f9f1fee62e228da3d6186521c29a"} Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.544395 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e43fdd12-0361-428c-8318-d1cec1c95399-scripts\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.544471 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6941e0ca-8689-452e-82e4-d233cbbd45ec-scripts\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " 
pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.544506 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e43fdd12-0361-428c-8318-d1cec1c95399-ovn-controller-tls-certs\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.544795 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-etc-ovs\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.544834 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-var-lib\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.545756 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cx2q\" (UniqueName: \"kubernetes.io/projected/6941e0ca-8689-452e-82e4-d233cbbd45ec-kube-api-access-9cx2q\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.545794 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4pg5\" (UniqueName: \"kubernetes.io/projected/e43fdd12-0361-428c-8318-d1cec1c95399-kube-api-access-b4pg5\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " 
pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.545812 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43fdd12-0361-428c-8318-d1cec1c95399-combined-ca-bundle\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.545840 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-var-run\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.545862 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e43fdd12-0361-428c-8318-d1cec1c95399-var-log-ovn\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.545881 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e43fdd12-0361-428c-8318-d1cec1c95399-var-run\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.545898 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-var-log\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.545922 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e43fdd12-0361-428c-8318-d1cec1c95399-var-run-ovn\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.546337 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e43fdd12-0361-428c-8318-d1cec1c95399-var-run-ovn\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.546748 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e43fdd12-0361-428c-8318-d1cec1c95399-scripts\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.546939 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e43fdd12-0361-428c-8318-d1cec1c95399-var-log-ovn\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.547051 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e43fdd12-0361-428c-8318-d1cec1c95399-var-run\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.549870 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e43fdd12-0361-428c-8318-d1cec1c95399-ovn-controller-tls-certs\") pod 
\"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.550259 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43fdd12-0361-428c-8318-d1cec1c95399-combined-ca-bundle\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.571882 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4pg5\" (UniqueName: \"kubernetes.io/projected/e43fdd12-0361-428c-8318-d1cec1c95399-kube-api-access-b4pg5\") pod \"ovn-controller-9vssn\" (UID: \"e43fdd12-0361-428c-8318-d1cec1c95399\") " pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.636131 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9vssn" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.646583 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cx2q\" (UniqueName: \"kubernetes.io/projected/6941e0ca-8689-452e-82e4-d233cbbd45ec-kube-api-access-9cx2q\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.646662 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-var-run\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.646698 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-var-log\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.646768 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6941e0ca-8689-452e-82e4-d233cbbd45ec-scripts\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.646813 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-etc-ovs\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.646827 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-var-run\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.646849 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-var-lib\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.647065 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-var-log\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " 
pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.647142 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-var-lib\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.647210 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6941e0ca-8689-452e-82e4-d233cbbd45ec-etc-ovs\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.649204 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6941e0ca-8689-452e-82e4-d233cbbd45ec-scripts\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.667459 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cx2q\" (UniqueName: \"kubernetes.io/projected/6941e0ca-8689-452e-82e4-d233cbbd45ec-kube-api-access-9cx2q\") pod \"ovn-controller-ovs-5hbsq\" (UID: \"6941e0ca-8689-452e-82e4-d233cbbd45ec\") " pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:39 crc kubenswrapper[4911]: I0310 14:21:39.720642 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.217222 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.218590 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.221017 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9m9w7" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.221056 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.221216 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.221442 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.221515 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.240970 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.272315 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a8355c9-0644-458c-9df7-bbbfd01fc249-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.272414 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a8355c9-0644-458c-9df7-bbbfd01fc249-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.272550 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3a8355c9-0644-458c-9df7-bbbfd01fc249-config\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.272613 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf6rj\" (UniqueName: \"kubernetes.io/projected/3a8355c9-0644-458c-9df7-bbbfd01fc249-kube-api-access-hf6rj\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.272660 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a8355c9-0644-458c-9df7-bbbfd01fc249-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.272705 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.272781 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a8355c9-0644-458c-9df7-bbbfd01fc249-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.272819 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3a8355c9-0644-458c-9df7-bbbfd01fc249-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.374745 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.374812 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a8355c9-0644-458c-9df7-bbbfd01fc249-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.374847 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a8355c9-0644-458c-9df7-bbbfd01fc249-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.374881 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a8355c9-0644-458c-9df7-bbbfd01fc249-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.374917 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a8355c9-0644-458c-9df7-bbbfd01fc249-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " 
pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.374972 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a8355c9-0644-458c-9df7-bbbfd01fc249-config\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.375008 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf6rj\" (UniqueName: \"kubernetes.io/projected/3a8355c9-0644-458c-9df7-bbbfd01fc249-kube-api-access-hf6rj\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.375051 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a8355c9-0644-458c-9df7-bbbfd01fc249-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.375206 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.375606 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a8355c9-0644-458c-9df7-bbbfd01fc249-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.376291 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3a8355c9-0644-458c-9df7-bbbfd01fc249-config\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.377624 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a8355c9-0644-458c-9df7-bbbfd01fc249-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.381279 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a8355c9-0644-458c-9df7-bbbfd01fc249-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.382868 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a8355c9-0644-458c-9df7-bbbfd01fc249-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.382912 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a8355c9-0644-458c-9df7-bbbfd01fc249-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.395134 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf6rj\" (UniqueName: \"kubernetes.io/projected/3a8355c9-0644-458c-9df7-bbbfd01fc249-kube-api-access-hf6rj\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " 
pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.402929 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3a8355c9-0644-458c-9df7-bbbfd01fc249\") " pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:40 crc kubenswrapper[4911]: I0310 14:21:40.570484 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 14:21:41 crc kubenswrapper[4911]: I0310 14:21:41.986224 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.785486 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.787290 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.789023 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-bcxs5" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.790473 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.790912 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.791008 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.803256 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.859969 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.860043 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gglxt\" (UniqueName: \"kubernetes.io/projected/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-kube-api-access-gglxt\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.860084 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.860136 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.860187 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-config\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.860240 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.860348 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.860381 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.962435 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.962502 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-config\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.962529 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.962559 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.962582 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.962631 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.962663 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gglxt\" (UniqueName: \"kubernetes.io/projected/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-kube-api-access-gglxt\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.962693 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.963133 4911 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.963806 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.964574 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-config\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.964665 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.968534 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.969091 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-ovsdbserver-sb-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.971366 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:43 crc kubenswrapper[4911]: I0310 14:21:43.983158 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gglxt\" (UniqueName: \"kubernetes.io/projected/48ab1a9d-fcce-4cdf-8e73-cae7562b4269-kube-api-access-gglxt\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:44 crc kubenswrapper[4911]: I0310 14:21:44.002645 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48ab1a9d-fcce-4cdf-8e73-cae7562b4269\") " pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:44 crc kubenswrapper[4911]: I0310 14:21:44.117074 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 14:21:47 crc kubenswrapper[4911]: W0310 14:21:47.962833 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ff8ebc9_ea10_4e9c_be23_96608817ed84.slice/crio-f13a314bb79c91a1c6b9bf6dc973b92b860c0786a507252a6530b6c920a0b0d1 WatchSource:0}: Error finding container f13a314bb79c91a1c6b9bf6dc973b92b860c0786a507252a6530b6c920a0b0d1: Status 404 returned error can't find the container with id f13a314bb79c91a1c6b9bf6dc973b92b860c0786a507252a6530b6c920a0b0d1 Mar 10 14:21:48 crc kubenswrapper[4911]: I0310 14:21:48.403536 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 14:21:48 crc kubenswrapper[4911]: I0310 14:21:48.520613 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:21:48 crc kubenswrapper[4911]: I0310 14:21:48.520704 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:21:48 crc kubenswrapper[4911]: I0310 14:21:48.520822 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:21:48 crc kubenswrapper[4911]: I0310 14:21:48.521698 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b451ff8e7fd4c75aa2b34c26affcab379b47b137f2e280e6643a4a5092850d94"} 
pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 14:21:48 crc kubenswrapper[4911]: I0310 14:21:48.521785 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://b451ff8e7fd4c75aa2b34c26affcab379b47b137f2e280e6643a4a5092850d94" gracePeriod=600 Mar 10 14:21:48 crc kubenswrapper[4911]: I0310 14:21:48.542985 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5ff8ebc9-ea10-4e9c-be23-96608817ed84","Type":"ContainerStarted","Data":"f13a314bb79c91a1c6b9bf6dc973b92b860c0786a507252a6530b6c920a0b0d1"} Mar 10 14:21:49 crc kubenswrapper[4911]: I0310 14:21:49.556378 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="b451ff8e7fd4c75aa2b34c26affcab379b47b137f2e280e6643a4a5092850d94" exitCode=0 Mar 10 14:21:49 crc kubenswrapper[4911]: I0310 14:21:49.556431 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"b451ff8e7fd4c75aa2b34c26affcab379b47b137f2e280e6643a4a5092850d94"} Mar 10 14:21:49 crc kubenswrapper[4911]: I0310 14:21:49.556479 4911 scope.go:117] "RemoveContainer" containerID="064f54de59fb1087deb1f06362fea8b7318f6c645504d0d54010b3ae33528b2f" Mar 10 14:21:53 crc kubenswrapper[4911]: E0310 14:21:53.659792 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 14:21:53 crc kubenswrapper[4911]: E0310 14:21:53.660406 4911 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4xpsp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-675f4bcbfc-fp6h2_openstack(13fa709c-20ac-4e81-ad9e-6755042b11ef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:21:53 crc kubenswrapper[4911]: E0310 14:21:53.661903 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" podUID="13fa709c-20ac-4e81-ad9e-6755042b11ef" Mar 10 14:21:53 crc kubenswrapper[4911]: E0310 14:21:53.667289 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 14:21:53 crc kubenswrapper[4911]: E0310 14:21:53.667468 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gv4mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-h5xdw_openstack(d536338c-ff72-4f8c-99c6-59b3feba5cf8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:21:53 crc kubenswrapper[4911]: E0310 14:21:53.668666 4911 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" podUID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.595657 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" podUID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.692427 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.692604 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jgpfh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(afa5978d-b0b8-4edb-b3ca-27b7bb1ee745): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:21:54 crc 
kubenswrapper[4911]: E0310 14:21:54.693818 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.730907 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.731397 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2kr66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(8d1a8c73-283d-431f-bfd3-af06ca3c60ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:21:54 crc 
kubenswrapper[4911]: E0310 14:21:54.732578 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.797151 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.797394 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lmfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-ffg46_openstack(6076d902-fab7-408a-a6ca-1d2fb6d9e5aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.798571 4911 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-ffg46" podUID="6076d902-fab7-408a-a6ca-1d2fb6d9e5aa" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.812594 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.812810 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hpssc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-2hzb7_openstack(b7f7e715-7aa2-4903-b925-3763fee5918d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:21:54 crc kubenswrapper[4911]: E0310 14:21:54.813975 4911 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" podUID="b7f7e715-7aa2-4903-b925-3763fee5918d" Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.000663 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.192590 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13fa709c-20ac-4e81-ad9e-6755042b11ef-config\") pod \"13fa709c-20ac-4e81-ad9e-6755042b11ef\" (UID: \"13fa709c-20ac-4e81-ad9e-6755042b11ef\") " Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.192809 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xpsp\" (UniqueName: \"kubernetes.io/projected/13fa709c-20ac-4e81-ad9e-6755042b11ef-kube-api-access-4xpsp\") pod \"13fa709c-20ac-4e81-ad9e-6755042b11ef\" (UID: \"13fa709c-20ac-4e81-ad9e-6755042b11ef\") " Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.193266 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13fa709c-20ac-4e81-ad9e-6755042b11ef-config" (OuterVolumeSpecName: "config") pod "13fa709c-20ac-4e81-ad9e-6755042b11ef" (UID: "13fa709c-20ac-4e81-ad9e-6755042b11ef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.194063 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13fa709c-20ac-4e81-ad9e-6755042b11ef-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.198682 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13fa709c-20ac-4e81-ad9e-6755042b11ef-kube-api-access-4xpsp" (OuterVolumeSpecName: "kube-api-access-4xpsp") pod "13fa709c-20ac-4e81-ad9e-6755042b11ef" (UID: "13fa709c-20ac-4e81-ad9e-6755042b11ef"). InnerVolumeSpecName "kube-api-access-4xpsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.255415 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 14:21:55 crc kubenswrapper[4911]: W0310 14:21:55.261017 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88c561c2_ba90_4331_8dc1_3098939b3b3c.slice/crio-ab53cbb0a833a25a3edf807934855367260b314422a17469ab0934b7247e5928 WatchSource:0}: Error finding container ab53cbb0a833a25a3edf807934855367260b314422a17469ab0934b7247e5928: Status 404 returned error can't find the container with id ab53cbb0a833a25a3edf807934855367260b314422a17469ab0934b7247e5928 Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.295623 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xpsp\" (UniqueName: \"kubernetes.io/projected/13fa709c-20ac-4e81-ad9e-6755042b11ef-kube-api-access-4xpsp\") on node \"crc\" DevicePath \"\"" Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.540650 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.551534 4911 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9vssn"] Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.591020 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.601674 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"19d28b4207c776d043f5f0d2450f7371800625af7e9dbf7c4bc17586e1f99a7f"} Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.604269 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" event={"ID":"13fa709c-20ac-4e81-ad9e-6755042b11ef","Type":"ContainerDied","Data":"1002101d2b25278372b834a865f6d02f5af8388bcb9b3134a23beddc9ff0c8f8"} Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.604307 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fp6h2" Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.605318 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"88c561c2-ba90-4331-8dc1-3098939b3b3c","Type":"ContainerStarted","Data":"ab53cbb0a833a25a3edf807934855367260b314422a17469ab0934b7247e5928"} Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.606375 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"45599797-9a4e-428b-8f95-39b6db7bd84e","Type":"ContainerStarted","Data":"17949eb0844c7c944d871163d4703ff0ee8b53eff6ccd90473c170af45e4d37e"} Mar 10 14:21:55 crc kubenswrapper[4911]: E0310 14:21:55.608142 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" 
pod="openstack/rabbitmq-server-0" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" Mar 10 14:21:55 crc kubenswrapper[4911]: E0310 14:21:55.608524 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" Mar 10 14:21:55 crc kubenswrapper[4911]: E0310 14:21:55.608616 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-ffg46" podUID="6076d902-fab7-408a-a6ca-1d2fb6d9e5aa" Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.763466 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fp6h2"] Mar 10 14:21:55 crc kubenswrapper[4911]: I0310 14:21:55.770358 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fp6h2"] Mar 10 14:21:56 crc kubenswrapper[4911]: I0310 14:21:56.205213 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13fa709c-20ac-4e81-ad9e-6755042b11ef" path="/var/lib/kubelet/pods/13fa709c-20ac-4e81-ad9e-6755042b11ef/volumes" Mar 10 14:21:56 crc kubenswrapper[4911]: I0310 14:21:56.328043 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 14:21:56 crc kubenswrapper[4911]: I0310 14:21:56.479872 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5hbsq"] Mar 10 14:21:57 crc kubenswrapper[4911]: W0310 14:21:57.574450 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a8355c9_0644_458c_9df7_bbbfd01fc249.slice/crio-1954f99aaf879f4eb464a801c29cb8e143b24dafd79f0b629d49f192c7f8d290 WatchSource:0}: Error finding container 1954f99aaf879f4eb464a801c29cb8e143b24dafd79f0b629d49f192c7f8d290: Status 404 returned error can't find the container with id 1954f99aaf879f4eb464a801c29cb8e143b24dafd79f0b629d49f192c7f8d290 Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.620165 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.624411 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"48ab1a9d-fcce-4cdf-8e73-cae7562b4269","Type":"ContainerStarted","Data":"614c6c76464bdcd60254c3574770595a84b372bb451dc9516762c5ee4b45ef36"} Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.626105 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9vssn" event={"ID":"e43fdd12-0361-428c-8318-d1cec1c95399","Type":"ContainerStarted","Data":"3861f72139c507a6801815981a8690e1e67f1f25d2f691a889a3ba819195f820"} Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.627232 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.627268 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-2hzb7" event={"ID":"b7f7e715-7aa2-4903-b925-3763fee5918d","Type":"ContainerDied","Data":"fb0b5c3cc30b455b99d2510c4b52eec20ed55a04e539f7259f23f58169cdb142"} Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.628519 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6e7efec5-8494-472d-b149-a6aeed4810b2","Type":"ContainerStarted","Data":"5b5ad3b84d69e30287d4c907b447cf0b8c22d92b29852db8a8a9ed5c383a7b3a"} Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.629716 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a8355c9-0644-458c-9df7-bbbfd01fc249","Type":"ContainerStarted","Data":"1954f99aaf879f4eb464a801c29cb8e143b24dafd79f0b629d49f192c7f8d290"} Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.756970 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-config\") pod \"b7f7e715-7aa2-4903-b925-3763fee5918d\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.757136 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpssc\" (UniqueName: \"kubernetes.io/projected/b7f7e715-7aa2-4903-b925-3763fee5918d-kube-api-access-hpssc\") pod \"b7f7e715-7aa2-4903-b925-3763fee5918d\" (UID: \"b7f7e715-7aa2-4903-b925-3763fee5918d\") " Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.757184 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-dns-svc\") pod \"b7f7e715-7aa2-4903-b925-3763fee5918d\" (UID: 
\"b7f7e715-7aa2-4903-b925-3763fee5918d\") " Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.757578 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-config" (OuterVolumeSpecName: "config") pod "b7f7e715-7aa2-4903-b925-3763fee5918d" (UID: "b7f7e715-7aa2-4903-b925-3763fee5918d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.757661 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7f7e715-7aa2-4903-b925-3763fee5918d" (UID: "b7f7e715-7aa2-4903-b925-3763fee5918d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.757756 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.765712 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7f7e715-7aa2-4903-b925-3763fee5918d-kube-api-access-hpssc" (OuterVolumeSpecName: "kube-api-access-hpssc") pod "b7f7e715-7aa2-4903-b925-3763fee5918d" (UID: "b7f7e715-7aa2-4903-b925-3763fee5918d"). InnerVolumeSpecName "kube-api-access-hpssc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.859747 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7f7e715-7aa2-4903-b925-3763fee5918d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.859779 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpssc\" (UniqueName: \"kubernetes.io/projected/b7f7e715-7aa2-4903-b925-3763fee5918d-kube-api-access-hpssc\") on node \"crc\" DevicePath \"\"" Mar 10 14:21:57 crc kubenswrapper[4911]: I0310 14:21:57.994887 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2hzb7"] Mar 10 14:21:58 crc kubenswrapper[4911]: I0310 14:21:58.007814 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2hzb7"] Mar 10 14:21:58 crc kubenswrapper[4911]: I0310 14:21:58.203384 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7f7e715-7aa2-4903-b925-3763fee5918d" path="/var/lib/kubelet/pods/b7f7e715-7aa2-4903-b925-3763fee5918d/volumes" Mar 10 14:21:58 crc kubenswrapper[4911]: I0310 14:21:58.640146 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5hbsq" event={"ID":"6941e0ca-8689-452e-82e4-d233cbbd45ec","Type":"ContainerStarted","Data":"557f9a524a1f44bf2784b68fc3d7cccdf92605c31e8c474ee239a5067642c1fb"} Mar 10 14:21:59 crc kubenswrapper[4911]: I0310 14:21:59.654283 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5ff8ebc9-ea10-4e9c-be23-96608817ed84","Type":"ContainerStarted","Data":"a90684fba0e27fdf9c64f9287ba1c10aed2217a39552959903dafc5051fcf306"} Mar 10 14:21:59 crc kubenswrapper[4911]: I0310 14:21:59.658420 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"45599797-9a4e-428b-8f95-39b6db7bd84e","Type":"ContainerStarted","Data":"66a6cdd5fbb407cf1dc66790b8a5c7563b9ab6748c2816417c3a0beafcb2fe55"} Mar 10 14:21:59 crc kubenswrapper[4911]: I0310 14:21:59.658516 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 10 14:21:59 crc kubenswrapper[4911]: I0310 14:21:59.660526 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6e7efec5-8494-472d-b149-a6aeed4810b2","Type":"ContainerStarted","Data":"f35ada5cf6d12529f9001f090191d7475148024bad4455602bfe0724f4a8e39f"} Mar 10 14:21:59 crc kubenswrapper[4911]: I0310 14:21:59.698848 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.913759274 podStartE2EDuration="25.698813225s" podCreationTimestamp="2026-03-10 14:21:34 +0000 UTC" firstStartedPulling="2026-03-10 14:21:54.707010283 +0000 UTC m=+1219.270530200" lastFinishedPulling="2026-03-10 14:21:58.492064224 +0000 UTC m=+1223.055584151" observedRunningTime="2026-03-10 14:21:59.698412845 +0000 UTC m=+1224.261932762" watchObservedRunningTime="2026-03-10 14:21:59.698813225 +0000 UTC m=+1224.262333142" Mar 10 14:22:00 crc kubenswrapper[4911]: I0310 14:22:00.156318 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552542-f989r"] Mar 10 14:22:00 crc kubenswrapper[4911]: I0310 14:22:00.161015 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552542-f989r" Mar 10 14:22:00 crc kubenswrapper[4911]: I0310 14:22:00.162740 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552542-f989r"] Mar 10 14:22:00 crc kubenswrapper[4911]: I0310 14:22:00.166801 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:22:00 crc kubenswrapper[4911]: I0310 14:22:00.167129 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:22:00 crc kubenswrapper[4911]: I0310 14:22:00.167283 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:22:00 crc kubenswrapper[4911]: I0310 14:22:00.329771 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt7mw\" (UniqueName: \"kubernetes.io/projected/5101cda8-039d-4775-81dc-0923e9e3e089-kube-api-access-tt7mw\") pod \"auto-csr-approver-29552542-f989r\" (UID: \"5101cda8-039d-4775-81dc-0923e9e3e089\") " pod="openshift-infra/auto-csr-approver-29552542-f989r" Mar 10 14:22:00 crc kubenswrapper[4911]: I0310 14:22:00.431406 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt7mw\" (UniqueName: \"kubernetes.io/projected/5101cda8-039d-4775-81dc-0923e9e3e089-kube-api-access-tt7mw\") pod \"auto-csr-approver-29552542-f989r\" (UID: \"5101cda8-039d-4775-81dc-0923e9e3e089\") " pod="openshift-infra/auto-csr-approver-29552542-f989r" Mar 10 14:22:00 crc kubenswrapper[4911]: I0310 14:22:00.493455 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt7mw\" (UniqueName: \"kubernetes.io/projected/5101cda8-039d-4775-81dc-0923e9e3e089-kube-api-access-tt7mw\") pod \"auto-csr-approver-29552542-f989r\" (UID: \"5101cda8-039d-4775-81dc-0923e9e3e089\") " 
pod="openshift-infra/auto-csr-approver-29552542-f989r" Mar 10 14:22:00 crc kubenswrapper[4911]: I0310 14:22:00.785514 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552542-f989r" Mar 10 14:22:02 crc kubenswrapper[4911]: E0310 14:22:02.546779 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ff8ebc9_ea10_4e9c_be23_96608817ed84.slice/crio-a90684fba0e27fdf9c64f9287ba1c10aed2217a39552959903dafc5051fcf306.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ff8ebc9_ea10_4e9c_be23_96608817ed84.slice/crio-conmon-a90684fba0e27fdf9c64f9287ba1c10aed2217a39552959903dafc5051fcf306.scope\": RecentStats: unable to find data in memory cache]" Mar 10 14:22:02 crc kubenswrapper[4911]: I0310 14:22:02.711432 4911 generic.go:334] "Generic (PLEG): container finished" podID="5ff8ebc9-ea10-4e9c-be23-96608817ed84" containerID="a90684fba0e27fdf9c64f9287ba1c10aed2217a39552959903dafc5051fcf306" exitCode=0 Mar 10 14:22:02 crc kubenswrapper[4911]: I0310 14:22:02.711529 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5ff8ebc9-ea10-4e9c-be23-96608817ed84","Type":"ContainerDied","Data":"a90684fba0e27fdf9c64f9287ba1c10aed2217a39552959903dafc5051fcf306"} Mar 10 14:22:02 crc kubenswrapper[4911]: I0310 14:22:02.744323 4911 generic.go:334] "Generic (PLEG): container finished" podID="6e7efec5-8494-472d-b149-a6aeed4810b2" containerID="f35ada5cf6d12529f9001f090191d7475148024bad4455602bfe0724f4a8e39f" exitCode=0 Mar 10 14:22:02 crc kubenswrapper[4911]: I0310 14:22:02.744388 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"6e7efec5-8494-472d-b149-a6aeed4810b2","Type":"ContainerDied","Data":"f35ada5cf6d12529f9001f090191d7475148024bad4455602bfe0724f4a8e39f"} Mar 10 14:22:02 crc kubenswrapper[4911]: I0310 14:22:02.876601 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.032331 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552542-f989r"] Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.754386 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552542-f989r" event={"ID":"5101cda8-039d-4775-81dc-0923e9e3e089","Type":"ContainerStarted","Data":"40c1b1b374f7daa0c1e10d4e2d37d1f270e0203c39c6d66a5093d6a77050dd3b"} Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.758277 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9vssn" event={"ID":"e43fdd12-0361-428c-8318-d1cec1c95399","Type":"ContainerStarted","Data":"90a883e0e4cdf83fede825a9f8e0093488283e741e0bc93165494b19c643ca4e"} Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.758450 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9vssn" Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.762972 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6e7efec5-8494-472d-b149-a6aeed4810b2","Type":"ContainerStarted","Data":"7c39513e3162709a0f581d344b49680f2511581a1bed07c7b936d3cab3c9cf9a"} Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.765216 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a8355c9-0644-458c-9df7-bbbfd01fc249","Type":"ContainerStarted","Data":"d4f5b1830a6ade78175aa269391f72f82975ed6439116377356d6cf32db5b84c"} Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.768797 4911 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5ff8ebc9-ea10-4e9c-be23-96608817ed84","Type":"ContainerStarted","Data":"51ab920999f393709db6b2eaddd87434c7e0281ac1aaf1e9d0064e963a4ccb3b"} Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.772247 4911 generic.go:334] "Generic (PLEG): container finished" podID="6941e0ca-8689-452e-82e4-d233cbbd45ec" containerID="07db6c57ea3b80260652a3bccd5ea77311c59b792805006f23e3bb3368d14cfd" exitCode=0 Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.772327 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5hbsq" event={"ID":"6941e0ca-8689-452e-82e4-d233cbbd45ec","Type":"ContainerDied","Data":"07db6c57ea3b80260652a3bccd5ea77311c59b792805006f23e3bb3368d14cfd"} Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.782971 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9vssn" podStartSLOduration=19.768595845 podStartE2EDuration="24.782941941s" podCreationTimestamp="2026-03-10 14:21:39 +0000 UTC" firstStartedPulling="2026-03-10 14:21:57.567135686 +0000 UTC m=+1222.130655593" lastFinishedPulling="2026-03-10 14:22:02.581481772 +0000 UTC m=+1227.145001689" observedRunningTime="2026-03-10 14:22:03.781007643 +0000 UTC m=+1228.344527550" watchObservedRunningTime="2026-03-10 14:22:03.782941941 +0000 UTC m=+1228.346461848" Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.783478 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"88c561c2-ba90-4331-8dc1-3098939b3b3c","Type":"ContainerStarted","Data":"e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0"} Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.783652 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.802519 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"48ab1a9d-fcce-4cdf-8e73-cae7562b4269","Type":"ContainerStarted","Data":"0f2b5920cec767d1480c2b2ca0958205df2ba02631aab992ef39695d99dd4cbb"} Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.810299 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.288108495 podStartE2EDuration="32.810273894s" podCreationTimestamp="2026-03-10 14:21:31 +0000 UTC" firstStartedPulling="2026-03-10 14:21:47.967896805 +0000 UTC m=+1212.531416722" lastFinishedPulling="2026-03-10 14:21:58.490062194 +0000 UTC m=+1223.053582121" observedRunningTime="2026-03-10 14:22:03.8089241 +0000 UTC m=+1228.372444027" watchObservedRunningTime="2026-03-10 14:22:03.810273894 +0000 UTC m=+1228.373793811" Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.834729 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.927540629 podStartE2EDuration="31.834496948s" podCreationTimestamp="2026-03-10 14:21:32 +0000 UTC" firstStartedPulling="2026-03-10 14:21:57.585397872 +0000 UTC m=+1222.148917789" lastFinishedPulling="2026-03-10 14:21:58.492354191 +0000 UTC m=+1223.055874108" observedRunningTime="2026-03-10 14:22:03.831433682 +0000 UTC m=+1228.394953599" watchObservedRunningTime="2026-03-10 14:22:03.834496948 +0000 UTC m=+1228.398016865" Mar 10 14:22:03 crc kubenswrapper[4911]: I0310 14:22:03.878840 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=20.55374172 podStartE2EDuration="27.878812694s" podCreationTimestamp="2026-03-10 14:21:36 +0000 UTC" firstStartedPulling="2026-03-10 14:21:55.267259228 +0000 UTC m=+1219.830779145" lastFinishedPulling="2026-03-10 14:22:02.592330202 +0000 UTC m=+1227.155850119" observedRunningTime="2026-03-10 14:22:03.872921247 +0000 UTC m=+1228.436441164" watchObservedRunningTime="2026-03-10 
14:22:03.878812694 +0000 UTC m=+1228.442332601" Mar 10 14:22:04 crc kubenswrapper[4911]: I0310 14:22:04.249832 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 10 14:22:04 crc kubenswrapper[4911]: I0310 14:22:04.250213 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 10 14:22:04 crc kubenswrapper[4911]: I0310 14:22:04.494452 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 10 14:22:04 crc kubenswrapper[4911]: I0310 14:22:04.818712 4911 generic.go:334] "Generic (PLEG): container finished" podID="5101cda8-039d-4775-81dc-0923e9e3e089" containerID="fd545a046f233cba0f5f1b9a0d3b1abc53c9050872947bbd13a4722c17fdb802" exitCode=0 Mar 10 14:22:04 crc kubenswrapper[4911]: I0310 14:22:04.819231 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552542-f989r" event={"ID":"5101cda8-039d-4775-81dc-0923e9e3e089","Type":"ContainerDied","Data":"fd545a046f233cba0f5f1b9a0d3b1abc53c9050872947bbd13a4722c17fdb802"} Mar 10 14:22:04 crc kubenswrapper[4911]: I0310 14:22:04.825831 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5hbsq" event={"ID":"6941e0ca-8689-452e-82e4-d233cbbd45ec","Type":"ContainerStarted","Data":"04e5b1113a209f7e24a4002628bd4fd081971af0b2a9670500ba3f47bea23bbd"} Mar 10 14:22:04 crc kubenswrapper[4911]: I0310 14:22:04.825893 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5hbsq" event={"ID":"6941e0ca-8689-452e-82e4-d233cbbd45ec","Type":"ContainerStarted","Data":"cf0fddf737513057fbe800df639b106b090cee9d1627f321bb5ae9e8cf3957ac"} Mar 10 14:22:04 crc kubenswrapper[4911]: I0310 14:22:04.862020 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5hbsq" podStartSLOduration=21.598091752 
podStartE2EDuration="25.861995496s" podCreationTimestamp="2026-03-10 14:21:39 +0000 UTC" firstStartedPulling="2026-03-10 14:21:58.306171633 +0000 UTC m=+1222.869691550" lastFinishedPulling="2026-03-10 14:22:02.570075377 +0000 UTC m=+1227.133595294" observedRunningTime="2026-03-10 14:22:04.853986126 +0000 UTC m=+1229.417506053" watchObservedRunningTime="2026-03-10 14:22:04.861995496 +0000 UTC m=+1229.425515413" Mar 10 14:22:05 crc kubenswrapper[4911]: I0310 14:22:05.838519 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:22:05 crc kubenswrapper[4911]: I0310 14:22:05.838609 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.666570 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ffg46"] Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.722543 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-n48zq"] Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.724163 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.748995 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-n48zq"] Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.890425 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlk9h\" (UniqueName: \"kubernetes.io/projected/efd1ceba-73de-4071-a519-fd080657ef9d-kube-api-access-jlk9h\") pod \"dnsmasq-dns-7cb5889db5-n48zq\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.890482 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-n48zq\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.890574 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-config\") pod \"dnsmasq-dns-7cb5889db5-n48zq\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.992381 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-config\") pod \"dnsmasq-dns-7cb5889db5-n48zq\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.992500 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlk9h\" (UniqueName: 
\"kubernetes.io/projected/efd1ceba-73de-4071-a519-fd080657ef9d-kube-api-access-jlk9h\") pod \"dnsmasq-dns-7cb5889db5-n48zq\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.992535 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-n48zq\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:06 crc kubenswrapper[4911]: I0310 14:22:06.994554 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-config\") pod \"dnsmasq-dns-7cb5889db5-n48zq\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:06.998643 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-n48zq\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.019934 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlk9h\" (UniqueName: \"kubernetes.io/projected/efd1ceba-73de-4071-a519-fd080657ef9d-kube-api-access-jlk9h\") pod \"dnsmasq-dns-7cb5889db5-n48zq\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.050424 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.770644 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552542-f989r" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.776538 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ffg46" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.880941 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552542-f989r" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.881011 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552542-f989r" event={"ID":"5101cda8-039d-4775-81dc-0923e9e3e089","Type":"ContainerDied","Data":"40c1b1b374f7daa0c1e10d4e2d37d1f270e0203c39c6d66a5093d6a77050dd3b"} Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.881094 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40c1b1b374f7daa0c1e10d4e2d37d1f270e0203c39c6d66a5093d6a77050dd3b" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.883965 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 10 14:22:07 crc kubenswrapper[4911]: E0310 14:22:07.884577 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5101cda8-039d-4775-81dc-0923e9e3e089" containerName="oc" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.884694 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5101cda8-039d-4775-81dc-0923e9e3e089" containerName="oc" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.884969 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ffg46" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.885129 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5101cda8-039d-4775-81dc-0923e9e3e089" containerName="oc" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.892328 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ffg46" event={"ID":"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa","Type":"ContainerDied","Data":"d5e5d356eff1f14a5218d82602074c306de5d67b6541908ce44c1cf19cd8f5e9"} Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.892643 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.899419 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.899864 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.900039 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-mlhkm" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.900150 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.916280 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-config\") pod \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.916447 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lmfp\" (UniqueName: 
\"kubernetes.io/projected/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-kube-api-access-9lmfp\") pod \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.916522 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt7mw\" (UniqueName: \"kubernetes.io/projected/5101cda8-039d-4775-81dc-0923e9e3e089-kube-api-access-tt7mw\") pod \"5101cda8-039d-4775-81dc-0923e9e3e089\" (UID: \"5101cda8-039d-4775-81dc-0923e9e3e089\") " Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.916589 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-dns-svc\") pod \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\" (UID: \"6076d902-fab7-408a-a6ca-1d2fb6d9e5aa\") " Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.916848 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-config" (OuterVolumeSpecName: "config") pod "6076d902-fab7-408a-a6ca-1d2fb6d9e5aa" (UID: "6076d902-fab7-408a-a6ca-1d2fb6d9e5aa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.917091 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.917252 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.917708 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6076d902-fab7-408a-a6ca-1d2fb6d9e5aa" (UID: "6076d902-fab7-408a-a6ca-1d2fb6d9e5aa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.921275 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-kube-api-access-9lmfp" (OuterVolumeSpecName: "kube-api-access-9lmfp") pod "6076d902-fab7-408a-a6ca-1d2fb6d9e5aa" (UID: "6076d902-fab7-408a-a6ca-1d2fb6d9e5aa"). InnerVolumeSpecName "kube-api-access-9lmfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:07 crc kubenswrapper[4911]: I0310 14:22:07.926946 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5101cda8-039d-4775-81dc-0923e9e3e089-kube-api-access-tt7mw" (OuterVolumeSpecName: "kube-api-access-tt7mw") pod "5101cda8-039d-4775-81dc-0923e9e3e089" (UID: "5101cda8-039d-4775-81dc-0923e9e3e089"). InnerVolumeSpecName "kube-api-access-tt7mw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.018870 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca33eab-721d-4858-8e23-9ffc6371926f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.019217 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2ca33eab-721d-4858-8e23-9ffc6371926f-lock\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.019267 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.019308 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2ca33eab-721d-4858-8e23-9ffc6371926f-cache\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.019570 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.019689 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtd42\" (UniqueName: \"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-kube-api-access-dtd42\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.019866 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lmfp\" (UniqueName: \"kubernetes.io/projected/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-kube-api-access-9lmfp\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.019888 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt7mw\" (UniqueName: \"kubernetes.io/projected/5101cda8-039d-4775-81dc-0923e9e3e089-kube-api-access-tt7mw\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.019904 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.121797 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2ca33eab-721d-4858-8e23-9ffc6371926f-cache\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.121913 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.121959 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtd42\" (UniqueName: 
\"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-kube-api-access-dtd42\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.122025 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca33eab-721d-4858-8e23-9ffc6371926f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.122049 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2ca33eab-721d-4858-8e23-9ffc6371926f-lock\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.122096 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: E0310 14:22:08.122273 4911 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 14:22:08 crc kubenswrapper[4911]: E0310 14:22:08.122289 4911 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 14:22:08 crc kubenswrapper[4911]: E0310 14:22:08.122795 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift podName:2ca33eab-721d-4858-8e23-9ffc6371926f nodeName:}" failed. 
No retries permitted until 2026-03-10 14:22:08.62277035 +0000 UTC m=+1233.186290267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift") pod "swift-storage-0" (UID: "2ca33eab-721d-4858-8e23-9ffc6371926f") : configmap "swift-ring-files" not found Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.123031 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.123247 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2ca33eab-721d-4858-8e23-9ffc6371926f-lock\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.124006 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2ca33eab-721d-4858-8e23-9ffc6371926f-cache\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.128915 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca33eab-721d-4858-8e23-9ffc6371926f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.141300 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtd42\" (UniqueName: 
\"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-kube-api-access-dtd42\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.179651 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.327416 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ffg46"] Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.341224 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-n48zq"] Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.350808 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ffg46"] Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.506266 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-hkqgc"] Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.508244 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.513483 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.513644 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.513761 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.523088 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hkqgc"] Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.630975 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-swiftconf\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.631031 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-etc-swift\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.631061 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-scripts\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.631155 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.631179 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-combined-ca-bundle\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.631204 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjb5w\" (UniqueName: \"kubernetes.io/projected/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-kube-api-access-hjb5w\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.631284 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-ring-data-devices\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: E0310 14:22:08.631375 4911 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 14:22:08 crc kubenswrapper[4911]: E0310 14:22:08.631401 4911 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 14:22:08 crc kubenswrapper[4911]: E0310 14:22:08.631477 4911 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift podName:2ca33eab-721d-4858-8e23-9ffc6371926f nodeName:}" failed. No retries permitted until 2026-03-10 14:22:09.631449708 +0000 UTC m=+1234.194969815 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift") pod "swift-storage-0" (UID: "2ca33eab-721d-4858-8e23-9ffc6371926f") : configmap "swift-ring-files" not found Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.631539 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-dispersionconf\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.733028 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-combined-ca-bundle\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.733096 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjb5w\" (UniqueName: \"kubernetes.io/projected/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-kube-api-access-hjb5w\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.733151 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-ring-data-devices\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.733194 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-dispersionconf\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.733238 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-swiftconf\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.733264 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-etc-swift\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.733286 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-scripts\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.734235 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-scripts\") pod \"swift-ring-rebalance-hkqgc\" (UID: 
\"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.734248 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-etc-swift\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.734512 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-ring-data-devices\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.736995 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-combined-ca-bundle\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.738558 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-dispersionconf\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.750667 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjb5w\" (UniqueName: \"kubernetes.io/projected/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-kube-api-access-hjb5w\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 
crc kubenswrapper[4911]: I0310 14:22:08.824298 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-swiftconf\") pod \"swift-ring-rebalance-hkqgc\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.830063 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.844580 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552536-jnbwk"] Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.851251 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552536-jnbwk"] Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.896416 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" event={"ID":"d536338c-ff72-4f8c-99c6-59b3feba5cf8","Type":"ContainerStarted","Data":"64a0ced6ce02cdc337613105a305d24e24da976dfa9c6923d14c1e235ec681eb"} Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.901640 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a8355c9-0644-458c-9df7-bbbfd01fc249","Type":"ContainerStarted","Data":"c517c0aefe68d0203eda31a6b649c275562d625e8eefb18b0590f7add1d9a1d1"} Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.904121 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" event={"ID":"efd1ceba-73de-4071-a519-fd080657ef9d","Type":"ContainerStarted","Data":"d8708ec781ec1aad367c505088173ad10a832b925d9bb79498274f727d8c3660"} Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.909070 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"48ab1a9d-fcce-4cdf-8e73-cae7562b4269","Type":"ContainerStarted","Data":"0487fcaf0eb0b545d8427116cfb6528e7408fcddecaf1ab6a2fd03ebda99c5af"} Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.948181 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=19.707260345 podStartE2EDuration="29.948151903s" podCreationTimestamp="2026-03-10 14:21:39 +0000 UTC" firstStartedPulling="2026-03-10 14:21:57.577034893 +0000 UTC m=+1222.140554810" lastFinishedPulling="2026-03-10 14:22:07.817926451 +0000 UTC m=+1232.381446368" observedRunningTime="2026-03-10 14:22:08.944900392 +0000 UTC m=+1233.508420309" watchObservedRunningTime="2026-03-10 14:22:08.948151903 +0000 UTC m=+1233.511671820" Mar 10 14:22:08 crc kubenswrapper[4911]: I0310 14:22:08.968497 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.748994917 podStartE2EDuration="26.96846767s" podCreationTimestamp="2026-03-10 14:21:42 +0000 UTC" firstStartedPulling="2026-03-10 14:21:57.585141285 +0000 UTC m=+1222.148661202" lastFinishedPulling="2026-03-10 14:22:07.804614048 +0000 UTC m=+1232.368133955" observedRunningTime="2026-03-10 14:22:08.965414204 +0000 UTC m=+1233.528934131" watchObservedRunningTime="2026-03-10 14:22:08.96846767 +0000 UTC m=+1233.531987587" Mar 10 14:22:09 crc kubenswrapper[4911]: I0310 14:22:09.118139 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 14:22:09 crc kubenswrapper[4911]: I0310 14:22:09.305860 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hkqgc"] Mar 10 14:22:09 crc kubenswrapper[4911]: W0310 14:22:09.326257 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cdd5d89_ce0c_4bce_90bf_f4a7c4e8c46a.slice/crio-c2808e91aeda496662e8149373b3ba99eff11e1dca92a4fc36463d888b313966 WatchSource:0}: Error finding container c2808e91aeda496662e8149373b3ba99eff11e1dca92a4fc36463d888b313966: Status 404 returned error can't find the container with id c2808e91aeda496662e8149373b3ba99eff11e1dca92a4fc36463d888b313966 Mar 10 14:22:09 crc kubenswrapper[4911]: I0310 14:22:09.653821 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:09 crc kubenswrapper[4911]: E0310 14:22:09.654077 4911 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 14:22:09 crc kubenswrapper[4911]: E0310 14:22:09.654410 4911 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 14:22:09 crc kubenswrapper[4911]: E0310 14:22:09.654556 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift podName:2ca33eab-721d-4858-8e23-9ffc6371926f nodeName:}" failed. No retries permitted until 2026-03-10 14:22:11.654536226 +0000 UTC m=+1236.218056133 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift") pod "swift-storage-0" (UID: "2ca33eab-721d-4858-8e23-9ffc6371926f") : configmap "swift-ring-files" not found Mar 10 14:22:09 crc kubenswrapper[4911]: I0310 14:22:09.990333 4911 generic.go:334] "Generic (PLEG): container finished" podID="efd1ceba-73de-4071-a519-fd080657ef9d" containerID="f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619" exitCode=0 Mar 10 14:22:09 crc kubenswrapper[4911]: I0310 14:22:09.990459 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" event={"ID":"efd1ceba-73de-4071-a519-fd080657ef9d","Type":"ContainerDied","Data":"f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619"} Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.018180 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hkqgc" event={"ID":"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a","Type":"ContainerStarted","Data":"c2808e91aeda496662e8149373b3ba99eff11e1dca92a4fc36463d888b313966"} Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.020303 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745","Type":"ContainerStarted","Data":"6a0ccf7ab32bbdd3db16a7cb4519b544bcfd41ff2a2f3f99c3689e09324d90f1"} Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.044431 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d1a8c73-283d-431f-bfd3-af06ca3c60ff","Type":"ContainerStarted","Data":"59082a05080778cdaeafa7755b2401054688c324364ece5a1d2c398ee3e47500"} Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.084238 4911 generic.go:334] "Generic (PLEG): container finished" podID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" containerID="64a0ced6ce02cdc337613105a305d24e24da976dfa9c6923d14c1e235ec681eb" 
exitCode=0 Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.085154 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" event={"ID":"d536338c-ff72-4f8c-99c6-59b3feba5cf8","Type":"ContainerDied","Data":"64a0ced6ce02cdc337613105a305d24e24da976dfa9c6923d14c1e235ec681eb"} Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.204006 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6076d902-fab7-408a-a6ca-1d2fb6d9e5aa" path="/var/lib/kubelet/pods/6076d902-fab7-408a-a6ca-1d2fb6d9e5aa/volumes" Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.204965 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1f4b1e-ab3f-4d2b-b2d1-185188a4795c" path="/var/lib/kubelet/pods/af1f4b1e-ab3f-4d2b-b2d1-185188a4795c/volumes" Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.372191 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.462697 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.571863 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.571936 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 14:22:10 crc kubenswrapper[4911]: I0310 14:22:10.627055 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.101260 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" 
event={"ID":"d536338c-ff72-4f8c-99c6-59b3feba5cf8","Type":"ContainerStarted","Data":"7154bb935c0dacd7db86707576837f2203df2738c1c0b05a4ad50dc86eb4f810"} Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.101645 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.107805 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" event={"ID":"efd1ceba-73de-4071-a519-fd080657ef9d","Type":"ContainerStarted","Data":"59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93"} Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.109520 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.117841 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.131829 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" podStartSLOduration=3.814855111 podStartE2EDuration="41.131804601s" podCreationTimestamp="2026-03-10 14:21:30 +0000 UTC" firstStartedPulling="2026-03-10 14:21:31.326689662 +0000 UTC m=+1195.890209579" lastFinishedPulling="2026-03-10 14:22:08.643639162 +0000 UTC m=+1233.207159069" observedRunningTime="2026-03-10 14:22:11.126807856 +0000 UTC m=+1235.690327773" watchObservedRunningTime="2026-03-10 14:22:11.131804601 +0000 UTC m=+1235.695324508" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.162165 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" podStartSLOduration=4.526661055 podStartE2EDuration="5.162142648s" podCreationTimestamp="2026-03-10 14:22:06 +0000 UTC" firstStartedPulling="2026-03-10 14:22:08.344034393 +0000 UTC 
m=+1232.907554310" lastFinishedPulling="2026-03-10 14:22:08.979515986 +0000 UTC m=+1233.543035903" observedRunningTime="2026-03-10 14:22:11.153544893 +0000 UTC m=+1235.717064810" watchObservedRunningTime="2026-03-10 14:22:11.162142648 +0000 UTC m=+1235.725662565" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.206642 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.225629 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.496785 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5xdw"] Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.530713 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-m56p6"] Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.532698 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.534925 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.547975 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-m56p6"] Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.589243 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6m8bl"] Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.591118 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.593181 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.603568 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6m8bl"] Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.618117 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.618321 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.618366 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-config\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.618392 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvrvk\" (UniqueName: \"kubernetes.io/projected/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-kube-api-access-dvrvk\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " 
pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.719970 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrmv\" (UniqueName: \"kubernetes.io/projected/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-kube-api-access-ngrmv\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.720026 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-combined-ca-bundle\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.720066 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.720136 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.720156 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-ovs-rundir\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " 
pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.720171 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.720200 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-ovn-rundir\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.720219 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-config\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.720254 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.720283 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-config\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " 
pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.720305 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvrvk\" (UniqueName: \"kubernetes.io/projected/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-kube-api-access-dvrvk\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.722216 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: E0310 14:22:11.722321 4911 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 14:22:11 crc kubenswrapper[4911]: E0310 14:22:11.722336 4911 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 14:22:11 crc kubenswrapper[4911]: E0310 14:22:11.722373 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift podName:2ca33eab-721d-4858-8e23-9ffc6371926f nodeName:}" failed. No retries permitted until 2026-03-10 14:22:15.722360472 +0000 UTC m=+1240.285880389 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift") pod "swift-storage-0" (UID: "2ca33eab-721d-4858-8e23-9ffc6371926f") : configmap "swift-ring-files" not found Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.723880 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-config\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.723898 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.745447 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvrvk\" (UniqueName: \"kubernetes.io/projected/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-kube-api-access-dvrvk\") pod \"dnsmasq-dns-74f6f696b9-m56p6\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.822201 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-combined-ca-bundle\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.822367 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-ovs-rundir\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.822387 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.822791 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-ovs-rundir\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.822854 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-ovn-rundir\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.822877 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-config\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.823202 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-ovn-rundir\") pod 
\"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.823363 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrmv\" (UniqueName: \"kubernetes.io/projected/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-kube-api-access-ngrmv\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.823784 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-config\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.826168 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.828538 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-combined-ca-bundle\") pod \"ovn-controller-metrics-6m8bl\" (UID: \"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.845529 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngrmv\" (UniqueName: \"kubernetes.io/projected/1c9db65b-8d56-4b07-86cd-dc73f3aa87fe-kube-api-access-ngrmv\") pod \"ovn-controller-metrics-6m8bl\" (UID: 
\"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe\") " pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.857415 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:11 crc kubenswrapper[4911]: I0310 14:22:11.923449 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6m8bl" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.005279 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-n48zq"] Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.043352 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-5xq5d"] Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.044904 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.048557 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.064832 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5xq5d"] Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.133489 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.134225 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-config\") pod \"dnsmasq-dns-698758b865-5xq5d\" 
(UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.134368 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95t9z\" (UniqueName: \"kubernetes.io/projected/5b96432c-00a6-4109-ad5a-5e81eedb4611-kube-api-access-95t9z\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.134426 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.134469 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-dns-svc\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.158425 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.237958 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.238044 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-dns-svc\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.238115 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.238240 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-config\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.238479 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95t9z\" (UniqueName: \"kubernetes.io/projected/5b96432c-00a6-4109-ad5a-5e81eedb4611-kube-api-access-95t9z\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.239128 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.239142 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-dns-svc\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.240150 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-config\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.240503 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.279703 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95t9z\" (UniqueName: \"kubernetes.io/projected/5b96432c-00a6-4109-ad5a-5e81eedb4611-kube-api-access-95t9z\") pod \"dnsmasq-dns-698758b865-5xq5d\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.326983 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.329050 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.332259 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.332423 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.332442 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.332576 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-j82sx" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.366519 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.370643 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.442198 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/43711b1d-4425-4081-ad98-ecee8b8c73c7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.442254 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43711b1d-4425-4081-ad98-ecee8b8c73c7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.447937 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2v49k\" (UniqueName: \"kubernetes.io/projected/43711b1d-4425-4081-ad98-ecee8b8c73c7-kube-api-access-2v49k\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.447985 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43711b1d-4425-4081-ad98-ecee8b8c73c7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.448014 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43711b1d-4425-4081-ad98-ecee8b8c73c7-config\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.448077 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/43711b1d-4425-4081-ad98-ecee8b8c73c7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.451002 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43711b1d-4425-4081-ad98-ecee8b8c73c7-scripts\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.553856 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v49k\" (UniqueName: \"kubernetes.io/projected/43711b1d-4425-4081-ad98-ecee8b8c73c7-kube-api-access-2v49k\") pod 
\"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.553908 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43711b1d-4425-4081-ad98-ecee8b8c73c7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.553936 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43711b1d-4425-4081-ad98-ecee8b8c73c7-config\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.553973 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/43711b1d-4425-4081-ad98-ecee8b8c73c7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.554032 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43711b1d-4425-4081-ad98-ecee8b8c73c7-scripts\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.554078 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/43711b1d-4425-4081-ad98-ecee8b8c73c7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.554098 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43711b1d-4425-4081-ad98-ecee8b8c73c7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.555723 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43711b1d-4425-4081-ad98-ecee8b8c73c7-config\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.555847 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43711b1d-4425-4081-ad98-ecee8b8c73c7-scripts\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.555877 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/43711b1d-4425-4081-ad98-ecee8b8c73c7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.559108 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/43711b1d-4425-4081-ad98-ecee8b8c73c7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.559611 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43711b1d-4425-4081-ad98-ecee8b8c73c7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc 
kubenswrapper[4911]: I0310 14:22:12.560388 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43711b1d-4425-4081-ad98-ecee8b8c73c7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.577486 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v49k\" (UniqueName: \"kubernetes.io/projected/43711b1d-4425-4081-ad98-ecee8b8c73c7-kube-api-access-2v49k\") pod \"ovn-northd-0\" (UID: \"43711b1d-4425-4081-ad98-ecee8b8c73c7\") " pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.655339 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.716142 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.716255 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.824387 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.973516 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-kgtcj"] Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.975015 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kgtcj" Mar 10 14:22:12 crc kubenswrapper[4911]: I0310 14:22:12.978525 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.004673 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kgtcj"] Mar 10 14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.068877 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-operator-scripts\") pod \"root-account-create-update-kgtcj\" (UID: \"4ce6d89e-51f0-4222-96d1-2d39f09b0e35\") " pod="openstack/root-account-create-update-kgtcj" Mar 10 14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.068944 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpv7p\" (UniqueName: \"kubernetes.io/projected/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-kube-api-access-mpv7p\") pod \"root-account-create-update-kgtcj\" (UID: \"4ce6d89e-51f0-4222-96d1-2d39f09b0e35\") " pod="openstack/root-account-create-update-kgtcj" Mar 10 14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.124567 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" podUID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" containerName="dnsmasq-dns" containerID="cri-o://7154bb935c0dacd7db86707576837f2203df2738c1c0b05a4ad50dc86eb4f810" gracePeriod=10 Mar 10 14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.125300 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" podUID="efd1ceba-73de-4071-a519-fd080657ef9d" containerName="dnsmasq-dns" containerID="cri-o://59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93" gracePeriod=10 Mar 10 
14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.174674 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-operator-scripts\") pod \"root-account-create-update-kgtcj\" (UID: \"4ce6d89e-51f0-4222-96d1-2d39f09b0e35\") " pod="openstack/root-account-create-update-kgtcj" Mar 10 14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.174905 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpv7p\" (UniqueName: \"kubernetes.io/projected/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-kube-api-access-mpv7p\") pod \"root-account-create-update-kgtcj\" (UID: \"4ce6d89e-51f0-4222-96d1-2d39f09b0e35\") " pod="openstack/root-account-create-update-kgtcj" Mar 10 14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.176182 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-operator-scripts\") pod \"root-account-create-update-kgtcj\" (UID: \"4ce6d89e-51f0-4222-96d1-2d39f09b0e35\") " pod="openstack/root-account-create-update-kgtcj" Mar 10 14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.196613 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpv7p\" (UniqueName: \"kubernetes.io/projected/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-kube-api-access-mpv7p\") pod \"root-account-create-update-kgtcj\" (UID: \"4ce6d89e-51f0-4222-96d1-2d39f09b0e35\") " pod="openstack/root-account-create-update-kgtcj" Mar 10 14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.243537 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 14:22:13 crc kubenswrapper[4911]: I0310 14:22:13.303500 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kgtcj" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.078250 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.142776 4911 generic.go:334] "Generic (PLEG): container finished" podID="efd1ceba-73de-4071-a519-fd080657ef9d" containerID="59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93" exitCode=0 Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.142869 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" event={"ID":"efd1ceba-73de-4071-a519-fd080657ef9d","Type":"ContainerDied","Data":"59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93"} Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.142924 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" event={"ID":"efd1ceba-73de-4071-a519-fd080657ef9d","Type":"ContainerDied","Data":"d8708ec781ec1aad367c505088173ad10a832b925d9bb79498274f727d8c3660"} Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.142949 4911 scope.go:117] "RemoveContainer" containerID="59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.143192 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-n48zq" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.148978 4911 generic.go:334] "Generic (PLEG): container finished" podID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" containerID="7154bb935c0dacd7db86707576837f2203df2738c1c0b05a4ad50dc86eb4f810" exitCode=0 Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.149037 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" event={"ID":"d536338c-ff72-4f8c-99c6-59b3feba5cf8","Type":"ContainerDied","Data":"7154bb935c0dacd7db86707576837f2203df2738c1c0b05a4ad50dc86eb4f810"} Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.188076 4911 scope.go:117] "RemoveContainer" containerID="f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.196263 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-config\") pod \"efd1ceba-73de-4071-a519-fd080657ef9d\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.196378 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlk9h\" (UniqueName: \"kubernetes.io/projected/efd1ceba-73de-4071-a519-fd080657ef9d-kube-api-access-jlk9h\") pod \"efd1ceba-73de-4071-a519-fd080657ef9d\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.196533 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-dns-svc\") pod \"efd1ceba-73de-4071-a519-fd080657ef9d\" (UID: \"efd1ceba-73de-4071-a519-fd080657ef9d\") " Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.205153 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/efd1ceba-73de-4071-a519-fd080657ef9d-kube-api-access-jlk9h" (OuterVolumeSpecName: "kube-api-access-jlk9h") pod "efd1ceba-73de-4071-a519-fd080657ef9d" (UID: "efd1ceba-73de-4071-a519-fd080657ef9d"). InnerVolumeSpecName "kube-api-access-jlk9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.248450 4911 scope.go:117] "RemoveContainer" containerID="59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93" Mar 10 14:22:14 crc kubenswrapper[4911]: E0310 14:22:14.251430 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93\": container with ID starting with 59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93 not found: ID does not exist" containerID="59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.251479 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93"} err="failed to get container status \"59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93\": rpc error: code = NotFound desc = could not find container \"59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93\": container with ID starting with 59e365f15e23d8cc85f441260a10cdc3090da1fbfa7b2d19eb599cbc1b06ba93 not found: ID does not exist" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.251532 4911 scope.go:117] "RemoveContainer" containerID="f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619" Mar 10 14:22:14 crc kubenswrapper[4911]: E0310 14:22:14.251868 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619\": container with ID starting with f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619 not found: ID does not exist" containerID="f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.251922 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619"} err="failed to get container status \"f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619\": rpc error: code = NotFound desc = could not find container \"f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619\": container with ID starting with f96e3402d7735164cdfeccbd4d831751a55d4f9e049a8456e04d7494f6daf619 not found: ID does not exist" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.268900 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-config" (OuterVolumeSpecName: "config") pod "efd1ceba-73de-4071-a519-fd080657ef9d" (UID: "efd1ceba-73de-4071-a519-fd080657ef9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.298341 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "efd1ceba-73de-4071-a519-fd080657ef9d" (UID: "efd1ceba-73de-4071-a519-fd080657ef9d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.306417 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.306458 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd1ceba-73de-4071-a519-fd080657ef9d-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.306468 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlk9h\" (UniqueName: \"kubernetes.io/projected/efd1ceba-73de-4071-a519-fd080657ef9d-kube-api-access-jlk9h\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.377518 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.492619 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-n48zq"] Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.500362 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-n48zq"] Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.511392 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv4mt\" (UniqueName: \"kubernetes.io/projected/d536338c-ff72-4f8c-99c6-59b3feba5cf8-kube-api-access-gv4mt\") pod \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.511574 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-config\") pod 
\"d536338c-ff72-4f8c-99c6-59b3feba5cf8\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.511612 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-dns-svc\") pod \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\" (UID: \"d536338c-ff72-4f8c-99c6-59b3feba5cf8\") " Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.517666 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d536338c-ff72-4f8c-99c6-59b3feba5cf8-kube-api-access-gv4mt" (OuterVolumeSpecName: "kube-api-access-gv4mt") pod "d536338c-ff72-4f8c-99c6-59b3feba5cf8" (UID: "d536338c-ff72-4f8c-99c6-59b3feba5cf8"). InnerVolumeSpecName "kube-api-access-gv4mt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.557017 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-config" (OuterVolumeSpecName: "config") pod "d536338c-ff72-4f8c-99c6-59b3feba5cf8" (UID: "d536338c-ff72-4f8c-99c6-59b3feba5cf8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.564499 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d536338c-ff72-4f8c-99c6-59b3feba5cf8" (UID: "d536338c-ff72-4f8c-99c6-59b3feba5cf8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.580827 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-m56p6"] Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.593818 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.608647 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kgtcj"] Mar 10 14:22:14 crc kubenswrapper[4911]: W0310 14:22:14.608693 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c625c5_1f3d_4f46_953b_dccee1d15d0e.slice/crio-5b01766528d7864a8858aea2de722c2e1cb917a6dd7cc2ab5f59f1477c76ad96 WatchSource:0}: Error finding container 5b01766528d7864a8858aea2de722c2e1cb917a6dd7cc2ab5f59f1477c76ad96: Status 404 returned error can't find the container with id 5b01766528d7864a8858aea2de722c2e1cb917a6dd7cc2ab5f59f1477c76ad96 Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.614603 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.614642 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d536338c-ff72-4f8c-99c6-59b3feba5cf8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.614883 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv4mt\" (UniqueName: \"kubernetes.io/projected/d536338c-ff72-4f8c-99c6-59b3feba5cf8-kube-api-access-gv4mt\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.628907 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-698758b865-5xq5d"] Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.638156 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6m8bl"] Mar 10 14:22:14 crc kubenswrapper[4911]: W0310 14:22:14.643269 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c9db65b_8d56_4b07_86cd_dc73f3aa87fe.slice/crio-86c3a5e8f9f5100aa4a2cfc350a94d063401980e8f3373e2c8b736f72c0ccca9 WatchSource:0}: Error finding container 86c3a5e8f9f5100aa4a2cfc350a94d063401980e8f3373e2c8b736f72c0ccca9: Status 404 returned error can't find the container with id 86c3a5e8f9f5100aa4a2cfc350a94d063401980e8f3373e2c8b736f72c0ccca9 Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.746331 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nk2c4"] Mar 10 14:22:14 crc kubenswrapper[4911]: E0310 14:22:14.746823 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" containerName="init" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.746848 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" containerName="init" Mar 10 14:22:14 crc kubenswrapper[4911]: E0310 14:22:14.746864 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" containerName="dnsmasq-dns" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.746871 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" containerName="dnsmasq-dns" Mar 10 14:22:14 crc kubenswrapper[4911]: E0310 14:22:14.746885 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd1ceba-73de-4071-a519-fd080657ef9d" containerName="dnsmasq-dns" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.746897 4911 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="efd1ceba-73de-4071-a519-fd080657ef9d" containerName="dnsmasq-dns" Mar 10 14:22:14 crc kubenswrapper[4911]: E0310 14:22:14.746925 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd1ceba-73de-4071-a519-fd080657ef9d" containerName="init" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.746931 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd1ceba-73de-4071-a519-fd080657ef9d" containerName="init" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.747107 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd1ceba-73de-4071-a519-fd080657ef9d" containerName="dnsmasq-dns" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.747130 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" containerName="dnsmasq-dns" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.747822 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nk2c4" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.767569 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7df8-account-create-update-d9npt"] Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.769089 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7df8-account-create-update-d9npt" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.774558 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nk2c4"] Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.816659 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.843684 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7df8-account-create-update-d9npt"] Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.923703 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-operator-scripts\") pod \"glance-db-create-nk2c4\" (UID: \"8905bd52-ff68-4e7c-818f-bf2a38b80b8f\") " pod="openstack/glance-db-create-nk2c4" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.924471 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7629\" (UniqueName: \"kubernetes.io/projected/16f1f748-2a8f-4786-907a-6800ad1c999c-kube-api-access-l7629\") pod \"glance-7df8-account-create-update-d9npt\" (UID: \"16f1f748-2a8f-4786-907a-6800ad1c999c\") " pod="openstack/glance-7df8-account-create-update-d9npt" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.924615 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv2bd\" (UniqueName: \"kubernetes.io/projected/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-kube-api-access-mv2bd\") pod \"glance-db-create-nk2c4\" (UID: \"8905bd52-ff68-4e7c-818f-bf2a38b80b8f\") " pod="openstack/glance-db-create-nk2c4" Mar 10 14:22:14 crc kubenswrapper[4911]: I0310 14:22:14.924680 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16f1f748-2a8f-4786-907a-6800ad1c999c-operator-scripts\") pod \"glance-7df8-account-create-update-d9npt\" (UID: \"16f1f748-2a8f-4786-907a-6800ad1c999c\") " pod="openstack/glance-7df8-account-create-update-d9npt" Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.026049 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-operator-scripts\") pod \"glance-db-create-nk2c4\" (UID: \"8905bd52-ff68-4e7c-818f-bf2a38b80b8f\") " pod="openstack/glance-db-create-nk2c4" Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.026468 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7629\" (UniqueName: \"kubernetes.io/projected/16f1f748-2a8f-4786-907a-6800ad1c999c-kube-api-access-l7629\") pod \"glance-7df8-account-create-update-d9npt\" (UID: \"16f1f748-2a8f-4786-907a-6800ad1c999c\") " pod="openstack/glance-7df8-account-create-update-d9npt" Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.026588 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv2bd\" (UniqueName: \"kubernetes.io/projected/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-kube-api-access-mv2bd\") pod \"glance-db-create-nk2c4\" (UID: \"8905bd52-ff68-4e7c-818f-bf2a38b80b8f\") " pod="openstack/glance-db-create-nk2c4" Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.026698 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16f1f748-2a8f-4786-907a-6800ad1c999c-operator-scripts\") pod \"glance-7df8-account-create-update-d9npt\" (UID: \"16f1f748-2a8f-4786-907a-6800ad1c999c\") " pod="openstack/glance-7df8-account-create-update-d9npt" Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.027178 4911 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-operator-scripts\") pod \"glance-db-create-nk2c4\" (UID: \"8905bd52-ff68-4e7c-818f-bf2a38b80b8f\") " pod="openstack/glance-db-create-nk2c4" Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.030419 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16f1f748-2a8f-4786-907a-6800ad1c999c-operator-scripts\") pod \"glance-7df8-account-create-update-d9npt\" (UID: \"16f1f748-2a8f-4786-907a-6800ad1c999c\") " pod="openstack/glance-7df8-account-create-update-d9npt" Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.046621 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv2bd\" (UniqueName: \"kubernetes.io/projected/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-kube-api-access-mv2bd\") pod \"glance-db-create-nk2c4\" (UID: \"8905bd52-ff68-4e7c-818f-bf2a38b80b8f\") " pod="openstack/glance-db-create-nk2c4" Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.056464 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7629\" (UniqueName: \"kubernetes.io/projected/16f1f748-2a8f-4786-907a-6800ad1c999c-kube-api-access-l7629\") pod \"glance-7df8-account-create-update-d9npt\" (UID: \"16f1f748-2a8f-4786-907a-6800ad1c999c\") " pod="openstack/glance-7df8-account-create-update-d9npt" Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.137898 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nk2c4"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.159427 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5xq5d" event={"ID":"5b96432c-00a6-4109-ad5a-5e81eedb4611","Type":"ContainerStarted","Data":"6320da9ff5e4181b78b3062ca71e9dcbc4cc06374c60a57fa28baa6e615942c1"}
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.162758 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" event={"ID":"e7c625c5-1f3d-4f46-953b-dccee1d15d0e","Type":"ContainerStarted","Data":"5b01766528d7864a8858aea2de722c2e1cb917a6dd7cc2ab5f59f1477c76ad96"}
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.163900 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"43711b1d-4425-4081-ad98-ecee8b8c73c7","Type":"ContainerStarted","Data":"606f996a71a579d5f9075074e4640058b00e5329b0b325c64d1ce17f750bc695"}
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.165792 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kgtcj" event={"ID":"4ce6d89e-51f0-4222-96d1-2d39f09b0e35","Type":"ContainerStarted","Data":"479f239be861604ac92a855158ef4193e40b59289665180975d6c665615ca352"}
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.175091 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7df8-account-create-update-d9npt"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.175136 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hkqgc" event={"ID":"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a","Type":"ContainerStarted","Data":"7d870905f885f9b5f096df151789c78e888b9e4738b3400aeef9bde6566506ce"}
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.182779 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6m8bl" event={"ID":"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe","Type":"ContainerStarted","Data":"86c3a5e8f9f5100aa4a2cfc350a94d063401980e8f3373e2c8b736f72c0ccca9"}
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.208336 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.209297 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-47sff"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.210890 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5xdw" event={"ID":"d536338c-ff72-4f8c-99c6-59b3feba5cf8","Type":"ContainerDied","Data":"ad8398624c564a2eea18748fa7ddcb741e8302f723392ea315383b43ea6986e3"}
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.210937 4911 scope.go:117] "RemoveContainer" containerID="7154bb935c0dacd7db86707576837f2203df2738c1c0b05a4ad50dc86eb4f810"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.211048 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-47sff"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.211554 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-hkqgc" podStartSLOduration=2.725254033 podStartE2EDuration="7.211541497s" podCreationTimestamp="2026-03-10 14:22:08 +0000 UTC" firstStartedPulling="2026-03-10 14:22:09.329172474 +0000 UTC m=+1233.892692391" lastFinishedPulling="2026-03-10 14:22:13.815459938 +0000 UTC m=+1238.378979855" observedRunningTime="2026-03-10 14:22:15.202705647 +0000 UTC m=+1239.766225564" watchObservedRunningTime="2026-03-10 14:22:15.211541497 +0000 UTC m=+1239.775061414"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.235268 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9mc9\" (UniqueName: \"kubernetes.io/projected/71f3b5bd-f962-4c8b-a792-01badc31a9d4-kube-api-access-z9mc9\") pod \"keystone-db-create-47sff\" (UID: \"71f3b5bd-f962-4c8b-a792-01badc31a9d4\") " pod="openstack/keystone-db-create-47sff"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.235891 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f3b5bd-f962-4c8b-a792-01badc31a9d4-operator-scripts\") pod \"keystone-db-create-47sff\" (UID: \"71f3b5bd-f962-4c8b-a792-01badc31a9d4\") " pod="openstack/keystone-db-create-47sff"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.254395 4911 scope.go:117] "RemoveContainer" containerID="64a0ced6ce02cdc337613105a305d24e24da976dfa9c6923d14c1e235ec681eb"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.264888 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-47sff"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.339930 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f3b5bd-f962-4c8b-a792-01badc31a9d4-operator-scripts\") pod \"keystone-db-create-47sff\" (UID: \"71f3b5bd-f962-4c8b-a792-01badc31a9d4\") " pod="openstack/keystone-db-create-47sff"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.340083 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9mc9\" (UniqueName: \"kubernetes.io/projected/71f3b5bd-f962-4c8b-a792-01badc31a9d4-kube-api-access-z9mc9\") pod \"keystone-db-create-47sff\" (UID: \"71f3b5bd-f962-4c8b-a792-01badc31a9d4\") " pod="openstack/keystone-db-create-47sff"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.341591 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f3b5bd-f962-4c8b-a792-01badc31a9d4-operator-scripts\") pod \"keystone-db-create-47sff\" (UID: \"71f3b5bd-f962-4c8b-a792-01badc31a9d4\") " pod="openstack/keystone-db-create-47sff"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.366425 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5xdw"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.369803 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9mc9\" (UniqueName: \"kubernetes.io/projected/71f3b5bd-f962-4c8b-a792-01badc31a9d4-kube-api-access-z9mc9\") pod \"keystone-db-create-47sff\" (UID: \"71f3b5bd-f962-4c8b-a792-01badc31a9d4\") " pod="openstack/keystone-db-create-47sff"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.379133 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5xdw"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.389601 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ac1c-account-create-update-2vkc9"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.392092 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ac1c-account-create-update-2vkc9"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.395231 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.398163 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ac1c-account-create-update-2vkc9"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.544403 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89d2b\" (UniqueName: \"kubernetes.io/projected/14a50823-d870-43f3-8599-ac8cf13ce7c8-kube-api-access-89d2b\") pod \"keystone-ac1c-account-create-update-2vkc9\" (UID: \"14a50823-d870-43f3-8599-ac8cf13ce7c8\") " pod="openstack/keystone-ac1c-account-create-update-2vkc9"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.544590 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14a50823-d870-43f3-8599-ac8cf13ce7c8-operator-scripts\") pod \"keystone-ac1c-account-create-update-2vkc9\" (UID: \"14a50823-d870-43f3-8599-ac8cf13ce7c8\") " pod="openstack/keystone-ac1c-account-create-update-2vkc9"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.545684 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6rbps"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.546943 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6rbps"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.556328 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-47sff"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.576175 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6rbps"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.650680 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14a50823-d870-43f3-8599-ac8cf13ce7c8-operator-scripts\") pod \"keystone-ac1c-account-create-update-2vkc9\" (UID: \"14a50823-d870-43f3-8599-ac8cf13ce7c8\") " pod="openstack/keystone-ac1c-account-create-update-2vkc9"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.650746 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dsgh\" (UniqueName: \"kubernetes.io/projected/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-kube-api-access-7dsgh\") pod \"placement-db-create-6rbps\" (UID: \"7bda7dfa-e3a9-4f11-9872-b83c855f7df6\") " pod="openstack/placement-db-create-6rbps"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.650876 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-operator-scripts\") pod \"placement-db-create-6rbps\" (UID: \"7bda7dfa-e3a9-4f11-9872-b83c855f7df6\") " pod="openstack/placement-db-create-6rbps"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.650906 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89d2b\" (UniqueName: \"kubernetes.io/projected/14a50823-d870-43f3-8599-ac8cf13ce7c8-kube-api-access-89d2b\") pod \"keystone-ac1c-account-create-update-2vkc9\" (UID: \"14a50823-d870-43f3-8599-ac8cf13ce7c8\") " pod="openstack/keystone-ac1c-account-create-update-2vkc9"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.652207 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14a50823-d870-43f3-8599-ac8cf13ce7c8-operator-scripts\") pod \"keystone-ac1c-account-create-update-2vkc9\" (UID: \"14a50823-d870-43f3-8599-ac8cf13ce7c8\") " pod="openstack/keystone-ac1c-account-create-update-2vkc9"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.654478 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b41f-account-create-update-h7v89"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.667097 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b41f-account-create-update-h7v89"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.670578 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89d2b\" (UniqueName: \"kubernetes.io/projected/14a50823-d870-43f3-8599-ac8cf13ce7c8-kube-api-access-89d2b\") pod \"keystone-ac1c-account-create-update-2vkc9\" (UID: \"14a50823-d870-43f3-8599-ac8cf13ce7c8\") " pod="openstack/keystone-ac1c-account-create-update-2vkc9"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.672414 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b41f-account-create-update-h7v89"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.672627 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.726255 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nk2c4"]
Mar 10 14:22:15 crc kubenswrapper[4911]: W0310 14:22:15.731148 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8905bd52_ff68_4e7c_818f_bf2a38b80b8f.slice/crio-24f98c23eec46054a31010a0ef8032f6109c90fbf46f1f0b176d0c6172925452 WatchSource:0}: Error finding container 24f98c23eec46054a31010a0ef8032f6109c90fbf46f1f0b176d0c6172925452: Status 404 returned error can't find the container with id 24f98c23eec46054a31010a0ef8032f6109c90fbf46f1f0b176d0c6172925452
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.753761 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16670393-24c4-4df7-b240-0509b1866a1a-operator-scripts\") pod \"placement-b41f-account-create-update-h7v89\" (UID: \"16670393-24c4-4df7-b240-0509b1866a1a\") " pod="openstack/placement-b41f-account-create-update-h7v89"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.753828 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp5m5\" (UniqueName: \"kubernetes.io/projected/16670393-24c4-4df7-b240-0509b1866a1a-kube-api-access-cp5m5\") pod \"placement-b41f-account-create-update-h7v89\" (UID: \"16670393-24c4-4df7-b240-0509b1866a1a\") " pod="openstack/placement-b41f-account-create-update-h7v89"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.753865 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-operator-scripts\") pod \"placement-db-create-6rbps\" (UID: \"7bda7dfa-e3a9-4f11-9872-b83c855f7df6\") " pod="openstack/placement-db-create-6rbps"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.753934 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dsgh\" (UniqueName: \"kubernetes.io/projected/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-kube-api-access-7dsgh\") pod \"placement-db-create-6rbps\" (UID: \"7bda7dfa-e3a9-4f11-9872-b83c855f7df6\") " pod="openstack/placement-db-create-6rbps"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.753979 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0"
Mar 10 14:22:15 crc kubenswrapper[4911]: E0310 14:22:15.754161 4911 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 10 14:22:15 crc kubenswrapper[4911]: E0310 14:22:15.754176 4911 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 10 14:22:15 crc kubenswrapper[4911]: E0310 14:22:15.754248 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift podName:2ca33eab-721d-4858-8e23-9ffc6371926f nodeName:}" failed. No retries permitted until 2026-03-10 14:22:23.754227973 +0000 UTC m=+1248.317747890 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift") pod "swift-storage-0" (UID: "2ca33eab-721d-4858-8e23-9ffc6371926f") : configmap "swift-ring-files" not found
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.756158 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-operator-scripts\") pod \"placement-db-create-6rbps\" (UID: \"7bda7dfa-e3a9-4f11-9872-b83c855f7df6\") " pod="openstack/placement-db-create-6rbps"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.775668 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dsgh\" (UniqueName: \"kubernetes.io/projected/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-kube-api-access-7dsgh\") pod \"placement-db-create-6rbps\" (UID: \"7bda7dfa-e3a9-4f11-9872-b83c855f7df6\") " pod="openstack/placement-db-create-6rbps"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.783144 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ac1c-account-create-update-2vkc9"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.830306 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7df8-account-create-update-d9npt"]
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.856080 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16670393-24c4-4df7-b240-0509b1866a1a-operator-scripts\") pod \"placement-b41f-account-create-update-h7v89\" (UID: \"16670393-24c4-4df7-b240-0509b1866a1a\") " pod="openstack/placement-b41f-account-create-update-h7v89"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.856131 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp5m5\" (UniqueName: \"kubernetes.io/projected/16670393-24c4-4df7-b240-0509b1866a1a-kube-api-access-cp5m5\") pod \"placement-b41f-account-create-update-h7v89\" (UID: \"16670393-24c4-4df7-b240-0509b1866a1a\") " pod="openstack/placement-b41f-account-create-update-h7v89"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.860012 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16670393-24c4-4df7-b240-0509b1866a1a-operator-scripts\") pod \"placement-b41f-account-create-update-h7v89\" (UID: \"16670393-24c4-4df7-b240-0509b1866a1a\") " pod="openstack/placement-b41f-account-create-update-h7v89"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.894429 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6rbps"
Mar 10 14:22:15 crc kubenswrapper[4911]: I0310 14:22:15.894522 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp5m5\" (UniqueName: \"kubernetes.io/projected/16670393-24c4-4df7-b240-0509b1866a1a-kube-api-access-cp5m5\") pod \"placement-b41f-account-create-update-h7v89\" (UID: \"16670393-24c4-4df7-b240-0509b1866a1a\") " pod="openstack/placement-b41f-account-create-update-h7v89"
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.029536 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b41f-account-create-update-h7v89"
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.068180 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ac1c-account-create-update-2vkc9"]
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.077498 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-47sff"]
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.209395 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d536338c-ff72-4f8c-99c6-59b3feba5cf8" path="/var/lib/kubelet/pods/d536338c-ff72-4f8c-99c6-59b3feba5cf8/volumes"
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.210211 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efd1ceba-73de-4071-a519-fd080657ef9d" path="/var/lib/kubelet/pods/efd1ceba-73de-4071-a519-fd080657ef9d/volumes"
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.221095 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nk2c4" event={"ID":"8905bd52-ff68-4e7c-818f-bf2a38b80b8f","Type":"ContainerStarted","Data":"654710fd98e60b38b77a58e75ae5052b12e196850bf1051c9beecb856314b843"}
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.221151 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nk2c4" event={"ID":"8905bd52-ff68-4e7c-818f-bf2a38b80b8f","Type":"ContainerStarted","Data":"24f98c23eec46054a31010a0ef8032f6109c90fbf46f1f0b176d0c6172925452"}
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.233250 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6m8bl" event={"ID":"1c9db65b-8d56-4b07-86cd-dc73f3aa87fe","Type":"ContainerStarted","Data":"2bfee1b06011d145a0dbc748da3dfcf61a049cced1c6dac2f4760e8d82eee2db"}
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.239442 4911 generic.go:334] "Generic (PLEG): container finished" podID="5b96432c-00a6-4109-ad5a-5e81eedb4611" containerID="5ad281a7a8c2fb6cf9f522fb9d5879cceccde5d1bc94a6b2335256689af30165" exitCode=0
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.239515 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5xq5d" event={"ID":"5b96432c-00a6-4109-ad5a-5e81eedb4611","Type":"ContainerDied","Data":"5ad281a7a8c2fb6cf9f522fb9d5879cceccde5d1bc94a6b2335256689af30165"}
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.241573 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7df8-account-create-update-d9npt" event={"ID":"16f1f748-2a8f-4786-907a-6800ad1c999c","Type":"ContainerStarted","Data":"1373d6351cb965a4a3cceec35a025110e2796cc36818ab19142c65a000244b44"}
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.241626 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7df8-account-create-update-d9npt" event={"ID":"16f1f748-2a8f-4786-907a-6800ad1c999c","Type":"ContainerStarted","Data":"0bc8597a086f5e19c563380a9058ce9cfd50ac15995db2946d8098628781cab5"}
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.252847 4911 generic.go:334] "Generic (PLEG): container finished" podID="e7c625c5-1f3d-4f46-953b-dccee1d15d0e" containerID="aaa7ff7f4614508efee5305acb9e2bcdef1c856851cd8fb1046ee40e84d87fef" exitCode=0
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.253041 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" event={"ID":"e7c625c5-1f3d-4f46-953b-dccee1d15d0e","Type":"ContainerDied","Data":"aaa7ff7f4614508efee5305acb9e2bcdef1c856851cd8fb1046ee40e84d87fef"}
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.255886 4911 generic.go:334] "Generic (PLEG): container finished" podID="4ce6d89e-51f0-4222-96d1-2d39f09b0e35" containerID="f495a1455781098ed5d3ce1cc62156705490a948b0e4ac25a4fd8e17b20971da" exitCode=0
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.256023 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kgtcj" event={"ID":"4ce6d89e-51f0-4222-96d1-2d39f09b0e35","Type":"ContainerDied","Data":"f495a1455781098ed5d3ce1cc62156705490a948b0e4ac25a4fd8e17b20971da"}
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.413849 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-nk2c4" podStartSLOduration=2.413764547 podStartE2EDuration="2.413764547s" podCreationTimestamp="2026-03-10 14:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:16.402038034 +0000 UTC m=+1240.965557981" watchObservedRunningTime="2026-03-10 14:22:16.413764547 +0000 UTC m=+1240.977284464"
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.488662 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6rbps"]
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.514500 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6m8bl" podStartSLOduration=5.514471801 podStartE2EDuration="5.514471801s" podCreationTimestamp="2026-03-10 14:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:16.447441097 +0000 UTC m=+1241.010961014" watchObservedRunningTime="2026-03-10 14:22:16.514471801 +0000 UTC m=+1241.077991718"
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.526409 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-7df8-account-create-update-d9npt" podStartSLOduration=2.526386188 podStartE2EDuration="2.526386188s" podCreationTimestamp="2026-03-10 14:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:16.488564144 +0000 UTC m=+1241.052084061" watchObservedRunningTime="2026-03-10 14:22:16.526386188 +0000 UTC m=+1241.089906105"
Mar 10 14:22:16 crc kubenswrapper[4911]: I0310 14:22:16.701347 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.271789 4911 generic.go:334] "Generic (PLEG): container finished" podID="16f1f748-2a8f-4786-907a-6800ad1c999c" containerID="1373d6351cb965a4a3cceec35a025110e2796cc36818ab19142c65a000244b44" exitCode=0
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.271857 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7df8-account-create-update-d9npt" event={"ID":"16f1f748-2a8f-4786-907a-6800ad1c999c","Type":"ContainerDied","Data":"1373d6351cb965a4a3cceec35a025110e2796cc36818ab19142c65a000244b44"}
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.277540 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-47sff" event={"ID":"71f3b5bd-f962-4c8b-a792-01badc31a9d4","Type":"ContainerStarted","Data":"310148cba69c6b994a360eee0633e4f63707f3b908843705b5f37f3065682555"}
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.280705 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6rbps" event={"ID":"7bda7dfa-e3a9-4f11-9872-b83c855f7df6","Type":"ContainerStarted","Data":"0b4ecb4cf7e78cc1cc1969e9f0dfeaf531e027a48b9ac86c95a80b19b3fceecf"}
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.291008 4911 generic.go:334] "Generic (PLEG): container finished" podID="8905bd52-ff68-4e7c-818f-bf2a38b80b8f" containerID="654710fd98e60b38b77a58e75ae5052b12e196850bf1051c9beecb856314b843" exitCode=0
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.291186 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nk2c4" event={"ID":"8905bd52-ff68-4e7c-818f-bf2a38b80b8f","Type":"ContainerDied","Data":"654710fd98e60b38b77a58e75ae5052b12e196850bf1051c9beecb856314b843"}
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.294181 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ac1c-account-create-update-2vkc9" event={"ID":"14a50823-d870-43f3-8599-ac8cf13ce7c8","Type":"ContainerStarted","Data":"af3078a57f9ce6f5fa1dbe62a1651a0ec5050d34bed4021c22619f94fd24f094"}
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.350071 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b41f-account-create-update-h7v89"]
Mar 10 14:22:17 crc kubenswrapper[4911]: W0310 14:22:17.408701 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16670393_24c4_4df7_b240_0509b1866a1a.slice/crio-f715d18c35a0ad20b0ad882425d20582c23d5e94f5b074fc07f83a95968450b2 WatchSource:0}: Error finding container f715d18c35a0ad20b0ad882425d20582c23d5e94f5b074fc07f83a95968450b2: Status 404 returned error can't find the container with id f715d18c35a0ad20b0ad882425d20582c23d5e94f5b074fc07f83a95968450b2
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.736607 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kgtcj"
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.839141 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-operator-scripts\") pod \"4ce6d89e-51f0-4222-96d1-2d39f09b0e35\" (UID: \"4ce6d89e-51f0-4222-96d1-2d39f09b0e35\") "
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.839646 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpv7p\" (UniqueName: \"kubernetes.io/projected/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-kube-api-access-mpv7p\") pod \"4ce6d89e-51f0-4222-96d1-2d39f09b0e35\" (UID: \"4ce6d89e-51f0-4222-96d1-2d39f09b0e35\") "
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.841555 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ce6d89e-51f0-4222-96d1-2d39f09b0e35" (UID: "4ce6d89e-51f0-4222-96d1-2d39f09b0e35"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.851699 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-kube-api-access-mpv7p" (OuterVolumeSpecName: "kube-api-access-mpv7p") pod "4ce6d89e-51f0-4222-96d1-2d39f09b0e35" (UID: "4ce6d89e-51f0-4222-96d1-2d39f09b0e35"). InnerVolumeSpecName "kube-api-access-mpv7p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.941056 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 14:22:17 crc kubenswrapper[4911]: I0310 14:22:17.941103 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpv7p\" (UniqueName: \"kubernetes.io/projected/4ce6d89e-51f0-4222-96d1-2d39f09b0e35-kube-api-access-mpv7p\") on node \"crc\" DevicePath \"\""
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.307824 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"43711b1d-4425-4081-ad98-ecee8b8c73c7","Type":"ContainerStarted","Data":"ad97fd04878c2e666a5d741ee3f6e9d9b283ad23dd4ca2acf14400eb243e73cd"}
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.307887 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"43711b1d-4425-4081-ad98-ecee8b8c73c7","Type":"ContainerStarted","Data":"75daa9246aecbf1219c2995e75c7a3de9f0f31f90fc9b9dee75bf7d4739581bb"}
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.308019 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.309826 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kgtcj" event={"ID":"4ce6d89e-51f0-4222-96d1-2d39f09b0e35","Type":"ContainerDied","Data":"479f239be861604ac92a855158ef4193e40b59289665180975d6c665615ca352"}
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.309859 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="479f239be861604ac92a855158ef4193e40b59289665180975d6c665615ca352"
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.309928 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kgtcj"
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.311958 4911 generic.go:334] "Generic (PLEG): container finished" podID="16670393-24c4-4df7-b240-0509b1866a1a" containerID="3239adfda8fcd0f880275aa7f28ea5f8a24c039c002aa9def32a7ee99f37b24b" exitCode=0
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.312088 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b41f-account-create-update-h7v89" event={"ID":"16670393-24c4-4df7-b240-0509b1866a1a","Type":"ContainerDied","Data":"3239adfda8fcd0f880275aa7f28ea5f8a24c039c002aa9def32a7ee99f37b24b"}
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.312133 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b41f-account-create-update-h7v89" event={"ID":"16670393-24c4-4df7-b240-0509b1866a1a","Type":"ContainerStarted","Data":"f715d18c35a0ad20b0ad882425d20582c23d5e94f5b074fc07f83a95968450b2"}
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.314860 4911 generic.go:334] "Generic (PLEG): container finished" podID="7bda7dfa-e3a9-4f11-9872-b83c855f7df6" containerID="35d01e8c663507a7d71b55c9602d82e99b54f742c413bd58ad0542bd00b36ea5" exitCode=0
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.314949 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6rbps" event={"ID":"7bda7dfa-e3a9-4f11-9872-b83c855f7df6","Type":"ContainerDied","Data":"35d01e8c663507a7d71b55c9602d82e99b54f742c413bd58ad0542bd00b36ea5"}
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.316839 4911 generic.go:334] "Generic (PLEG): container finished" podID="14a50823-d870-43f3-8599-ac8cf13ce7c8" containerID="7c95aadb6d53bb8b916f8fe848a8a68b6048ba1a8a92df065a960f73dcd7ddad" exitCode=0
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.316876 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ac1c-account-create-update-2vkc9" event={"ID":"14a50823-d870-43f3-8599-ac8cf13ce7c8","Type":"ContainerDied","Data":"7c95aadb6d53bb8b916f8fe848a8a68b6048ba1a8a92df065a960f73dcd7ddad"}
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.319487 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5xq5d" event={"ID":"5b96432c-00a6-4109-ad5a-5e81eedb4611","Type":"ContainerStarted","Data":"088ce3e316098d69cf9687cae329d7b74b1d202795beb964670796cdb096419f"}
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.319633 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-5xq5d"
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.321322 4911 generic.go:334] "Generic (PLEG): container finished" podID="71f3b5bd-f962-4c8b-a792-01badc31a9d4" containerID="ef23ba72f589d96b300db6d4d42e933c2309b6c7e54564487471dc7ac93699f1" exitCode=0
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.321379 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-47sff" event={"ID":"71f3b5bd-f962-4c8b-a792-01badc31a9d4","Type":"ContainerDied","Data":"ef23ba72f589d96b300db6d4d42e933c2309b6c7e54564487471dc7ac93699f1"}
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.323754 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" event={"ID":"e7c625c5-1f3d-4f46-953b-dccee1d15d0e","Type":"ContainerStarted","Data":"f1a625682d9007d83f3897de06ca8fbe0bf81593321b4a0d7bb52c0fd3aa6d15"}
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.363634 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" podStartSLOduration=7.363598067 podStartE2EDuration="7.363598067s" podCreationTimestamp="2026-03-10 14:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:18.360776817 +0000 UTC m=+1242.924296744" watchObservedRunningTime="2026-03-10 14:22:18.363598067 +0000 UTC m=+1242.927117984"
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.369944 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.257255189 podStartE2EDuration="6.369909205s" podCreationTimestamp="2026-03-10 14:22:12 +0000 UTC" firstStartedPulling="2026-03-10 14:22:14.620394691 +0000 UTC m=+1239.183914608" lastFinishedPulling="2026-03-10 14:22:16.733048707 +0000 UTC m=+1241.296568624" observedRunningTime="2026-03-10 14:22:18.335756262 +0000 UTC m=+1242.899276199" watchObservedRunningTime="2026-03-10 14:22:18.369909205 +0000 UTC m=+1242.933429122"
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.392389 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-5xq5d" podStartSLOduration=6.392365565 podStartE2EDuration="6.392365565s" podCreationTimestamp="2026-03-10 14:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:18.385398171 +0000 UTC m=+1242.948918108" watchObservedRunningTime="2026-03-10 14:22:18.392365565 +0000 UTC m=+1242.955885472"
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.815899 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nk2c4"
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.821697 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7df8-account-create-update-d9npt"
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.873847 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv2bd\" (UniqueName: \"kubernetes.io/projected/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-kube-api-access-mv2bd\") pod \"8905bd52-ff68-4e7c-818f-bf2a38b80b8f\" (UID: \"8905bd52-ff68-4e7c-818f-bf2a38b80b8f\") "
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.873928 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-operator-scripts\") pod \"8905bd52-ff68-4e7c-818f-bf2a38b80b8f\" (UID: \"8905bd52-ff68-4e7c-818f-bf2a38b80b8f\") "
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.874014 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7629\" (UniqueName: \"kubernetes.io/projected/16f1f748-2a8f-4786-907a-6800ad1c999c-kube-api-access-l7629\") pod \"16f1f748-2a8f-4786-907a-6800ad1c999c\" (UID: \"16f1f748-2a8f-4786-907a-6800ad1c999c\") "
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.874178 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16f1f748-2a8f-4786-907a-6800ad1c999c-operator-scripts\") pod \"16f1f748-2a8f-4786-907a-6800ad1c999c\" (UID: \"16f1f748-2a8f-4786-907a-6800ad1c999c\") "
Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.875055 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8905bd52-ff68-4e7c-818f-bf2a38b80b8f" (UID: "8905bd52-ff68-4e7c-818f-bf2a38b80b8f"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.875103 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f1f748-2a8f-4786-907a-6800ad1c999c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16f1f748-2a8f-4786-907a-6800ad1c999c" (UID: "16f1f748-2a8f-4786-907a-6800ad1c999c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.875219 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16f1f748-2a8f-4786-907a-6800ad1c999c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.875235 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.882289 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-kube-api-access-mv2bd" (OuterVolumeSpecName: "kube-api-access-mv2bd") pod "8905bd52-ff68-4e7c-818f-bf2a38b80b8f" (UID: "8905bd52-ff68-4e7c-818f-bf2a38b80b8f"). InnerVolumeSpecName "kube-api-access-mv2bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.882391 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f1f748-2a8f-4786-907a-6800ad1c999c-kube-api-access-l7629" (OuterVolumeSpecName: "kube-api-access-l7629") pod "16f1f748-2a8f-4786-907a-6800ad1c999c" (UID: "16f1f748-2a8f-4786-907a-6800ad1c999c"). InnerVolumeSpecName "kube-api-access-l7629". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.976901 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7629\" (UniqueName: \"kubernetes.io/projected/16f1f748-2a8f-4786-907a-6800ad1c999c-kube-api-access-l7629\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:18 crc kubenswrapper[4911]: I0310 14:22:18.976962 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv2bd\" (UniqueName: \"kubernetes.io/projected/8905bd52-ff68-4e7c-818f-bf2a38b80b8f-kube-api-access-mv2bd\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.337830 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7df8-account-create-update-d9npt" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.337845 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7df8-account-create-update-d9npt" event={"ID":"16f1f748-2a8f-4786-907a-6800ad1c999c","Type":"ContainerDied","Data":"0bc8597a086f5e19c563380a9058ce9cfd50ac15995db2946d8098628781cab5"} Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.337923 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc8597a086f5e19c563380a9058ce9cfd50ac15995db2946d8098628781cab5" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.341247 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nk2c4" event={"ID":"8905bd52-ff68-4e7c-818f-bf2a38b80b8f","Type":"ContainerDied","Data":"24f98c23eec46054a31010a0ef8032f6109c90fbf46f1f0b176d0c6172925452"} Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.341311 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24f98c23eec46054a31010a0ef8032f6109c90fbf46f1f0b176d0c6172925452" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.341671 4911 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-db-create-nk2c4" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.342445 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.702408 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6rbps" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.799143 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-operator-scripts\") pod \"7bda7dfa-e3a9-4f11-9872-b83c855f7df6\" (UID: \"7bda7dfa-e3a9-4f11-9872-b83c855f7df6\") " Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.799668 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dsgh\" (UniqueName: \"kubernetes.io/projected/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-kube-api-access-7dsgh\") pod \"7bda7dfa-e3a9-4f11-9872-b83c855f7df6\" (UID: \"7bda7dfa-e3a9-4f11-9872-b83c855f7df6\") " Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.800466 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bda7dfa-e3a9-4f11-9872-b83c855f7df6" (UID: "7bda7dfa-e3a9-4f11-9872-b83c855f7df6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.804029 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-kube-api-access-7dsgh" (OuterVolumeSpecName: "kube-api-access-7dsgh") pod "7bda7dfa-e3a9-4f11-9872-b83c855f7df6" (UID: "7bda7dfa-e3a9-4f11-9872-b83c855f7df6"). 
InnerVolumeSpecName "kube-api-access-7dsgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.902255 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dsgh\" (UniqueName: \"kubernetes.io/projected/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-kube-api-access-7dsgh\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.902303 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bda7dfa-e3a9-4f11-9872-b83c855f7df6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.914946 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ac1c-account-create-update-2vkc9" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.921584 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-47sff" Mar 10 14:22:19 crc kubenswrapper[4911]: I0310 14:22:19.945196 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b41f-account-create-update-h7v89" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.003905 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16670393-24c4-4df7-b240-0509b1866a1a-operator-scripts\") pod \"16670393-24c4-4df7-b240-0509b1866a1a\" (UID: \"16670393-24c4-4df7-b240-0509b1866a1a\") " Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.003974 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9mc9\" (UniqueName: \"kubernetes.io/projected/71f3b5bd-f962-4c8b-a792-01badc31a9d4-kube-api-access-z9mc9\") pod \"71f3b5bd-f962-4c8b-a792-01badc31a9d4\" (UID: \"71f3b5bd-f962-4c8b-a792-01badc31a9d4\") " Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.004031 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89d2b\" (UniqueName: \"kubernetes.io/projected/14a50823-d870-43f3-8599-ac8cf13ce7c8-kube-api-access-89d2b\") pod \"14a50823-d870-43f3-8599-ac8cf13ce7c8\" (UID: \"14a50823-d870-43f3-8599-ac8cf13ce7c8\") " Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.004066 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp5m5\" (UniqueName: \"kubernetes.io/projected/16670393-24c4-4df7-b240-0509b1866a1a-kube-api-access-cp5m5\") pod \"16670393-24c4-4df7-b240-0509b1866a1a\" (UID: \"16670393-24c4-4df7-b240-0509b1866a1a\") " Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.004159 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14a50823-d870-43f3-8599-ac8cf13ce7c8-operator-scripts\") pod \"14a50823-d870-43f3-8599-ac8cf13ce7c8\" (UID: \"14a50823-d870-43f3-8599-ac8cf13ce7c8\") " Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.004195 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f3b5bd-f962-4c8b-a792-01badc31a9d4-operator-scripts\") pod \"71f3b5bd-f962-4c8b-a792-01badc31a9d4\" (UID: \"71f3b5bd-f962-4c8b-a792-01badc31a9d4\") " Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.005136 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71f3b5bd-f962-4c8b-a792-01badc31a9d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71f3b5bd-f962-4c8b-a792-01badc31a9d4" (UID: "71f3b5bd-f962-4c8b-a792-01badc31a9d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.005618 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16670393-24c4-4df7-b240-0509b1866a1a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16670393-24c4-4df7-b240-0509b1866a1a" (UID: "16670393-24c4-4df7-b240-0509b1866a1a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.006217 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a50823-d870-43f3-8599-ac8cf13ce7c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14a50823-d870-43f3-8599-ac8cf13ce7c8" (UID: "14a50823-d870-43f3-8599-ac8cf13ce7c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.009628 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71f3b5bd-f962-4c8b-a792-01badc31a9d4-kube-api-access-z9mc9" (OuterVolumeSpecName: "kube-api-access-z9mc9") pod "71f3b5bd-f962-4c8b-a792-01badc31a9d4" (UID: "71f3b5bd-f962-4c8b-a792-01badc31a9d4"). 
InnerVolumeSpecName "kube-api-access-z9mc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.027198 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16670393-24c4-4df7-b240-0509b1866a1a-kube-api-access-cp5m5" (OuterVolumeSpecName: "kube-api-access-cp5m5") pod "16670393-24c4-4df7-b240-0509b1866a1a" (UID: "16670393-24c4-4df7-b240-0509b1866a1a"). InnerVolumeSpecName "kube-api-access-cp5m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.027270 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a50823-d870-43f3-8599-ac8cf13ce7c8-kube-api-access-89d2b" (OuterVolumeSpecName: "kube-api-access-89d2b") pod "14a50823-d870-43f3-8599-ac8cf13ce7c8" (UID: "14a50823-d870-43f3-8599-ac8cf13ce7c8"). InnerVolumeSpecName "kube-api-access-89d2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.107646 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89d2b\" (UniqueName: \"kubernetes.io/projected/14a50823-d870-43f3-8599-ac8cf13ce7c8-kube-api-access-89d2b\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.107719 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp5m5\" (UniqueName: \"kubernetes.io/projected/16670393-24c4-4df7-b240-0509b1866a1a-kube-api-access-cp5m5\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.107796 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14a50823-d870-43f3-8599-ac8cf13ce7c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.107832 4911 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f3b5bd-f962-4c8b-a792-01badc31a9d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.107859 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16670393-24c4-4df7-b240-0509b1866a1a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.107887 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9mc9\" (UniqueName: \"kubernetes.io/projected/71f3b5bd-f962-4c8b-a792-01badc31a9d4-kube-api-access-z9mc9\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.353947 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-47sff" event={"ID":"71f3b5bd-f962-4c8b-a792-01badc31a9d4","Type":"ContainerDied","Data":"310148cba69c6b994a360eee0633e4f63707f3b908843705b5f37f3065682555"} Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.354001 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="310148cba69c6b994a360eee0633e4f63707f3b908843705b5f37f3065682555" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.354099 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-47sff" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.356380 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b41f-account-create-update-h7v89" event={"ID":"16670393-24c4-4df7-b240-0509b1866a1a","Type":"ContainerDied","Data":"f715d18c35a0ad20b0ad882425d20582c23d5e94f5b074fc07f83a95968450b2"} Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.356415 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f715d18c35a0ad20b0ad882425d20582c23d5e94f5b074fc07f83a95968450b2" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.356539 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b41f-account-create-update-h7v89" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.358172 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6rbps" event={"ID":"7bda7dfa-e3a9-4f11-9872-b83c855f7df6","Type":"ContainerDied","Data":"0b4ecb4cf7e78cc1cc1969e9f0dfeaf531e027a48b9ac86c95a80b19b3fceecf"} Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.358211 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6rbps" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.358213 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b4ecb4cf7e78cc1cc1969e9f0dfeaf531e027a48b9ac86c95a80b19b3fceecf" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.363537 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ac1c-account-create-update-2vkc9" Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.363538 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ac1c-account-create-update-2vkc9" event={"ID":"14a50823-d870-43f3-8599-ac8cf13ce7c8","Type":"ContainerDied","Data":"af3078a57f9ce6f5fa1dbe62a1651a0ec5050d34bed4021c22619f94fd24f094"} Mar 10 14:22:20 crc kubenswrapper[4911]: I0310 14:22:20.363582 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af3078a57f9ce6f5fa1dbe62a1651a0ec5050d34bed4021c22619f94fd24f094" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.340876 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kgtcj"] Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.347366 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-kgtcj"] Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.421893 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jt4pw"] Mar 10 14:22:21 crc kubenswrapper[4911]: E0310 14:22:21.422300 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a50823-d870-43f3-8599-ac8cf13ce7c8" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422313 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a50823-d870-43f3-8599-ac8cf13ce7c8" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: E0310 14:22:21.422328 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f1f748-2a8f-4786-907a-6800ad1c999c" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422334 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f1f748-2a8f-4786-907a-6800ad1c999c" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc 
kubenswrapper[4911]: E0310 14:22:21.422348 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8905bd52-ff68-4e7c-818f-bf2a38b80b8f" containerName="mariadb-database-create" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422354 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8905bd52-ff68-4e7c-818f-bf2a38b80b8f" containerName="mariadb-database-create" Mar 10 14:22:21 crc kubenswrapper[4911]: E0310 14:22:21.422373 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bda7dfa-e3a9-4f11-9872-b83c855f7df6" containerName="mariadb-database-create" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422378 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bda7dfa-e3a9-4f11-9872-b83c855f7df6" containerName="mariadb-database-create" Mar 10 14:22:21 crc kubenswrapper[4911]: E0310 14:22:21.422390 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce6d89e-51f0-4222-96d1-2d39f09b0e35" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422396 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce6d89e-51f0-4222-96d1-2d39f09b0e35" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: E0310 14:22:21.422410 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f3b5bd-f962-4c8b-a792-01badc31a9d4" containerName="mariadb-database-create" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422416 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f3b5bd-f962-4c8b-a792-01badc31a9d4" containerName="mariadb-database-create" Mar 10 14:22:21 crc kubenswrapper[4911]: E0310 14:22:21.422427 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16670393-24c4-4df7-b240-0509b1866a1a" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422432 4911 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="16670393-24c4-4df7-b240-0509b1866a1a" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422589 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="71f3b5bd-f962-4c8b-a792-01badc31a9d4" containerName="mariadb-database-create" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422603 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a50823-d870-43f3-8599-ac8cf13ce7c8" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422614 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8905bd52-ff68-4e7c-818f-bf2a38b80b8f" containerName="mariadb-database-create" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422621 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bda7dfa-e3a9-4f11-9872-b83c855f7df6" containerName="mariadb-database-create" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422631 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="16670393-24c4-4df7-b240-0509b1866a1a" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422641 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce6d89e-51f0-4222-96d1-2d39f09b0e35" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.422649 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f1f748-2a8f-4786-907a-6800ad1c999c" containerName="mariadb-account-create-update" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.423316 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jt4pw" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.427894 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.485942 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jt4pw"] Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.536665 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-operator-scripts\") pod \"root-account-create-update-jt4pw\" (UID: \"9bb43f04-0674-4ce2-aaf5-3965d755f8fe\") " pod="openstack/root-account-create-update-jt4pw" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.536802 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-278v5\" (UniqueName: \"kubernetes.io/projected/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-kube-api-access-278v5\") pod \"root-account-create-update-jt4pw\" (UID: \"9bb43f04-0674-4ce2-aaf5-3965d755f8fe\") " pod="openstack/root-account-create-update-jt4pw" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.639184 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-278v5\" (UniqueName: \"kubernetes.io/projected/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-kube-api-access-278v5\") pod \"root-account-create-update-jt4pw\" (UID: \"9bb43f04-0674-4ce2-aaf5-3965d755f8fe\") " pod="openstack/root-account-create-update-jt4pw" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.639438 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-operator-scripts\") pod \"root-account-create-update-jt4pw\" (UID: 
\"9bb43f04-0674-4ce2-aaf5-3965d755f8fe\") " pod="openstack/root-account-create-update-jt4pw" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.640217 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-operator-scripts\") pod \"root-account-create-update-jt4pw\" (UID: \"9bb43f04-0674-4ce2-aaf5-3965d755f8fe\") " pod="openstack/root-account-create-update-jt4pw" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.659778 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-278v5\" (UniqueName: \"kubernetes.io/projected/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-kube-api-access-278v5\") pod \"root-account-create-update-jt4pw\" (UID: \"9bb43f04-0674-4ce2-aaf5-3965d755f8fe\") " pod="openstack/root-account-create-update-jt4pw" Mar 10 14:22:21 crc kubenswrapper[4911]: I0310 14:22:21.742318 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jt4pw" Mar 10 14:22:22 crc kubenswrapper[4911]: W0310 14:22:22.186905 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bb43f04_0674_4ce2_aaf5_3965d755f8fe.slice/crio-3b1839d40c4fa18d87da37e1297e1444b82990109625fffd68bf6ff3c828189d WatchSource:0}: Error finding container 3b1839d40c4fa18d87da37e1297e1444b82990109625fffd68bf6ff3c828189d: Status 404 returned error can't find the container with id 3b1839d40c4fa18d87da37e1297e1444b82990109625fffd68bf6ff3c828189d Mar 10 14:22:22 crc kubenswrapper[4911]: I0310 14:22:22.187888 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jt4pw"] Mar 10 14:22:22 crc kubenswrapper[4911]: I0310 14:22:22.212749 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce6d89e-51f0-4222-96d1-2d39f09b0e35" path="/var/lib/kubelet/pods/4ce6d89e-51f0-4222-96d1-2d39f09b0e35/volumes" Mar 10 14:22:22 crc kubenswrapper[4911]: I0310 14:22:22.368271 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:22:22 crc kubenswrapper[4911]: I0310 14:22:22.382207 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jt4pw" event={"ID":"9bb43f04-0674-4ce2-aaf5-3965d755f8fe","Type":"ContainerStarted","Data":"3b1839d40c4fa18d87da37e1297e1444b82990109625fffd68bf6ff3c828189d"} Mar 10 14:22:22 crc kubenswrapper[4911]: I0310 14:22:22.443518 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-m56p6"] Mar 10 14:22:22 crc kubenswrapper[4911]: I0310 14:22:22.443817 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" podUID="e7c625c5-1f3d-4f46-953b-dccee1d15d0e" containerName="dnsmasq-dns" 
containerID="cri-o://f1a625682d9007d83f3897de06ca8fbe0bf81593321b4a0d7bb52c0fd3aa6d15" gracePeriod=10 Mar 10 14:22:22 crc kubenswrapper[4911]: I0310 14:22:22.446369 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:23 crc kubenswrapper[4911]: I0310 14:22:23.392188 4911 generic.go:334] "Generic (PLEG): container finished" podID="e7c625c5-1f3d-4f46-953b-dccee1d15d0e" containerID="f1a625682d9007d83f3897de06ca8fbe0bf81593321b4a0d7bb52c0fd3aa6d15" exitCode=0 Mar 10 14:22:23 crc kubenswrapper[4911]: I0310 14:22:23.392274 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" event={"ID":"e7c625c5-1f3d-4f46-953b-dccee1d15d0e","Type":"ContainerDied","Data":"f1a625682d9007d83f3897de06ca8fbe0bf81593321b4a0d7bb52c0fd3aa6d15"} Mar 10 14:22:23 crc kubenswrapper[4911]: I0310 14:22:23.788174 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:23 crc kubenswrapper[4911]: E0310 14:22:23.788514 4911 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 14:22:23 crc kubenswrapper[4911]: E0310 14:22:23.788565 4911 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 14:22:23 crc kubenswrapper[4911]: E0310 14:22:23.788659 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift podName:2ca33eab-721d-4858-8e23-9ffc6371926f nodeName:}" failed. No retries permitted until 2026-03-10 14:22:39.788627394 +0000 UTC m=+1264.352147321 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift") pod "swift-storage-0" (UID: "2ca33eab-721d-4858-8e23-9ffc6371926f") : configmap "swift-ring-files" not found Mar 10 14:22:24 crc kubenswrapper[4911]: I0310 14:22:24.405050 4911 generic.go:334] "Generic (PLEG): container finished" podID="4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" containerID="7d870905f885f9b5f096df151789c78e888b9e4738b3400aeef9bde6566506ce" exitCode=0 Mar 10 14:22:24 crc kubenswrapper[4911]: I0310 14:22:24.405121 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hkqgc" event={"ID":"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a","Type":"ContainerDied","Data":"7d870905f885f9b5f096df151789c78e888b9e4738b3400aeef9bde6566506ce"} Mar 10 14:22:24 crc kubenswrapper[4911]: I0310 14:22:24.914901 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-6v9wc"] Mar 10 14:22:24 crc kubenswrapper[4911]: I0310 14:22:24.916771 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:24 crc kubenswrapper[4911]: I0310 14:22:24.920584 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 10 14:22:24 crc kubenswrapper[4911]: I0310 14:22:24.920855 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-94ddw" Mar 10 14:22:24 crc kubenswrapper[4911]: I0310 14:22:24.921853 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6v9wc"] Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.011073 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-combined-ca-bundle\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.011233 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-db-sync-config-data\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.011271 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-config-data\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.011334 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krwq8\" (UniqueName: 
\"kubernetes.io/projected/a49be6ea-3e0f-4975-89ff-9cfd022e4980-kube-api-access-krwq8\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.113374 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-combined-ca-bundle\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.113585 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-db-sync-config-data\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.114601 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-config-data\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.114685 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krwq8\" (UniqueName: \"kubernetes.io/projected/a49be6ea-3e0f-4975-89ff-9cfd022e4980-kube-api-access-krwq8\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.120553 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-combined-ca-bundle\") pod \"glance-db-sync-6v9wc\" 
(UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.120888 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-config-data\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.121967 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-db-sync-config-data\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.140014 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krwq8\" (UniqueName: \"kubernetes.io/projected/a49be6ea-3e0f-4975-89ff-9cfd022e4980-kube-api-access-krwq8\") pod \"glance-db-sync-6v9wc\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:25 crc kubenswrapper[4911]: I0310 14:22:25.242921 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:25.765891 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6v9wc"] Mar 10 14:22:28 crc kubenswrapper[4911]: W0310 14:22:25.771034 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda49be6ea_3e0f_4975_89ff_9cfd022e4980.slice/crio-8977a38cbaaf32eed51d32da6ec27ec80d42ffebe5949c913ded16c3750bfca1 WatchSource:0}: Error finding container 8977a38cbaaf32eed51d32da6ec27ec80d42ffebe5949c913ded16c3750bfca1: Status 404 returned error can't find the container with id 8977a38cbaaf32eed51d32da6ec27ec80d42ffebe5949c913ded16c3750bfca1 Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:26.427109 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6v9wc" event={"ID":"a49be6ea-3e0f-4975-89ff-9cfd022e4980","Type":"ContainerStarted","Data":"8977a38cbaaf32eed51d32da6ec27ec80d42ffebe5949c913ded16c3750bfca1"} Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:26.859244 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" podUID="e7c625c5-1f3d-4f46-953b-dccee1d15d0e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:27.437716 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jt4pw" event={"ID":"9bb43f04-0674-4ce2-aaf5-3965d755f8fe","Type":"ContainerStarted","Data":"6b0ebb6cc7e4572fdfb2020007d6495d715b0e01cbc5c9fc28ec757fb61c0de8"} Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:27.463678 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-jt4pw" podStartSLOduration=6.463651469 podStartE2EDuration="6.463651469s" podCreationTimestamp="2026-03-10 
14:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:27.456331516 +0000 UTC m=+1252.019851443" watchObservedRunningTime="2026-03-10 14:22:27.463651469 +0000 UTC m=+1252.027171386" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.465934 4911 generic.go:334] "Generic (PLEG): container finished" podID="9bb43f04-0674-4ce2-aaf5-3965d755f8fe" containerID="6b0ebb6cc7e4572fdfb2020007d6495d715b0e01cbc5c9fc28ec757fb61c0de8" exitCode=0 Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.466032 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jt4pw" event={"ID":"9bb43f04-0674-4ce2-aaf5-3965d755f8fe","Type":"ContainerDied","Data":"6b0ebb6cc7e4572fdfb2020007d6495d715b0e01cbc5c9fc28ec757fb61c0de8"} Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.794546 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.801532 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.905962 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-scripts\") pod \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.906063 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvrvk\" (UniqueName: \"kubernetes.io/projected/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-kube-api-access-dvrvk\") pod \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.906120 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-combined-ca-bundle\") pod \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.906150 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-ring-data-devices\") pod \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.906220 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-dispersionconf\") pod \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.906250 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-config\") pod \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.906279 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-ovsdbserver-nb\") pod \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.906386 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-swiftconf\") pod \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.906414 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-dns-svc\") pod \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\" (UID: \"e7c625c5-1f3d-4f46-953b-dccee1d15d0e\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.906508 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjb5w\" (UniqueName: \"kubernetes.io/projected/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-kube-api-access-hjb5w\") pod \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.906618 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-etc-swift\") pod \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\" (UID: \"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a\") " Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 
14:22:28.907087 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" (UID: "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.907578 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" (UID: "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.914246 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-kube-api-access-dvrvk" (OuterVolumeSpecName: "kube-api-access-dvrvk") pod "e7c625c5-1f3d-4f46-953b-dccee1d15d0e" (UID: "e7c625c5-1f3d-4f46-953b-dccee1d15d0e"). InnerVolumeSpecName "kube-api-access-dvrvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.914341 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-kube-api-access-hjb5w" (OuterVolumeSpecName: "kube-api-access-hjb5w") pod "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" (UID: "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a"). InnerVolumeSpecName "kube-api-access-hjb5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.917068 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" (UID: "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.933467 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-scripts" (OuterVolumeSpecName: "scripts") pod "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" (UID: "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.933634 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" (UID: "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.935318 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" (UID: "4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.950949 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7c625c5-1f3d-4f46-953b-dccee1d15d0e" (UID: "e7c625c5-1f3d-4f46-953b-dccee1d15d0e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.951688 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-config" (OuterVolumeSpecName: "config") pod "e7c625c5-1f3d-4f46-953b-dccee1d15d0e" (UID: "e7c625c5-1f3d-4f46-953b-dccee1d15d0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:28 crc kubenswrapper[4911]: I0310 14:22:28.963251 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7c625c5-1f3d-4f46-953b-dccee1d15d0e" (UID: "e7c625c5-1f3d-4f46-953b-dccee1d15d0e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009221 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009265 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvrvk\" (UniqueName: \"kubernetes.io/projected/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-kube-api-access-dvrvk\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009283 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009293 4911 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009304 4911 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009316 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009328 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009338 4911 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c625c5-1f3d-4f46-953b-dccee1d15d0e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009349 4911 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009357 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjb5w\" (UniqueName: \"kubernetes.io/projected/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-kube-api-access-hjb5w\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.009365 4911 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.481070 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" event={"ID":"e7c625c5-1f3d-4f46-953b-dccee1d15d0e","Type":"ContainerDied","Data":"5b01766528d7864a8858aea2de722c2e1cb917a6dd7cc2ab5f59f1477c76ad96"} Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.481081 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-m56p6" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.481145 4911 scope.go:117] "RemoveContainer" containerID="f1a625682d9007d83f3897de06ca8fbe0bf81593321b4a0d7bb52c0fd3aa6d15" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.485215 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hkqgc" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.485182 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hkqgc" event={"ID":"4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a","Type":"ContainerDied","Data":"c2808e91aeda496662e8149373b3ba99eff11e1dca92a4fc36463d888b313966"} Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.485261 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2808e91aeda496662e8149373b3ba99eff11e1dca92a4fc36463d888b313966" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.514108 4911 scope.go:117] "RemoveContainer" containerID="aaa7ff7f4614508efee5305acb9e2bcdef1c856851cd8fb1046ee40e84d87fef" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.522115 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-m56p6"] Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.536835 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-m56p6"] Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.782436 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jt4pw" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.823766 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-278v5\" (UniqueName: \"kubernetes.io/projected/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-kube-api-access-278v5\") pod \"9bb43f04-0674-4ce2-aaf5-3965d755f8fe\" (UID: \"9bb43f04-0674-4ce2-aaf5-3965d755f8fe\") " Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.823861 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-operator-scripts\") pod \"9bb43f04-0674-4ce2-aaf5-3965d755f8fe\" (UID: \"9bb43f04-0674-4ce2-aaf5-3965d755f8fe\") " Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.825604 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9bb43f04-0674-4ce2-aaf5-3965d755f8fe" (UID: "9bb43f04-0674-4ce2-aaf5-3965d755f8fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.829288 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-kube-api-access-278v5" (OuterVolumeSpecName: "kube-api-access-278v5") pod "9bb43f04-0674-4ce2-aaf5-3965d755f8fe" (UID: "9bb43f04-0674-4ce2-aaf5-3965d755f8fe"). InnerVolumeSpecName "kube-api-access-278v5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.926619 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-278v5\" (UniqueName: \"kubernetes.io/projected/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-kube-api-access-278v5\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:29 crc kubenswrapper[4911]: I0310 14:22:29.926997 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb43f04-0674-4ce2-aaf5-3965d755f8fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:30 crc kubenswrapper[4911]: I0310 14:22:30.246481 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c625c5-1f3d-4f46-953b-dccee1d15d0e" path="/var/lib/kubelet/pods/e7c625c5-1f3d-4f46-953b-dccee1d15d0e/volumes" Mar 10 14:22:30 crc kubenswrapper[4911]: I0310 14:22:30.499866 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jt4pw" event={"ID":"9bb43f04-0674-4ce2-aaf5-3965d755f8fe","Type":"ContainerDied","Data":"3b1839d40c4fa18d87da37e1297e1444b82990109625fffd68bf6ff3c828189d"} Mar 10 14:22:30 crc kubenswrapper[4911]: I0310 14:22:30.499920 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b1839d40c4fa18d87da37e1297e1444b82990109625fffd68bf6ff3c828189d" Mar 10 14:22:30 crc kubenswrapper[4911]: I0310 14:22:30.499989 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jt4pw" Mar 10 14:22:32 crc kubenswrapper[4911]: I0310 14:22:32.730077 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 14:22:33 crc kubenswrapper[4911]: I0310 14:22:33.008413 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jt4pw"] Mar 10 14:22:33 crc kubenswrapper[4911]: I0310 14:22:33.017095 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jt4pw"] Mar 10 14:22:34 crc kubenswrapper[4911]: I0310 14:22:34.209343 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb43f04-0674-4ce2-aaf5-3965d755f8fe" path="/var/lib/kubelet/pods/9bb43f04-0674-4ce2-aaf5-3965d755f8fe/volumes" Mar 10 14:22:34 crc kubenswrapper[4911]: I0310 14:22:34.681097 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9vssn" podUID="e43fdd12-0361-428c-8318-d1cec1c95399" containerName="ovn-controller" probeResult="failure" output=< Mar 10 14:22:34 crc kubenswrapper[4911]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 14:22:34 crc kubenswrapper[4911]: > Mar 10 14:22:34 crc kubenswrapper[4911]: I0310 14:22:34.803329 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:22:34 crc kubenswrapper[4911]: I0310 14:22:34.807239 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5hbsq" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.059435 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9vssn-config-rz99p"] Mar 10 14:22:35 crc kubenswrapper[4911]: E0310 14:22:35.060070 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c625c5-1f3d-4f46-953b-dccee1d15d0e" containerName="init" Mar 10 14:22:35 
crc kubenswrapper[4911]: I0310 14:22:35.060096 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c625c5-1f3d-4f46-953b-dccee1d15d0e" containerName="init" Mar 10 14:22:35 crc kubenswrapper[4911]: E0310 14:22:35.060115 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c625c5-1f3d-4f46-953b-dccee1d15d0e" containerName="dnsmasq-dns" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.060125 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c625c5-1f3d-4f46-953b-dccee1d15d0e" containerName="dnsmasq-dns" Mar 10 14:22:35 crc kubenswrapper[4911]: E0310 14:22:35.060146 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" containerName="swift-ring-rebalance" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.060158 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" containerName="swift-ring-rebalance" Mar 10 14:22:35 crc kubenswrapper[4911]: E0310 14:22:35.060171 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb43f04-0674-4ce2-aaf5-3965d755f8fe" containerName="mariadb-account-create-update" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.060179 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb43f04-0674-4ce2-aaf5-3965d755f8fe" containerName="mariadb-account-create-update" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.060388 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb43f04-0674-4ce2-aaf5-3965d755f8fe" containerName="mariadb-account-create-update" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.060417 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a" containerName="swift-ring-rebalance" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.060427 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c625c5-1f3d-4f46-953b-dccee1d15d0e" 
containerName="dnsmasq-dns" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.061485 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.065047 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.080269 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9vssn-config-rz99p"] Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.129227 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-scripts\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.129290 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-additional-scripts\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.129315 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7wkb\" (UniqueName: \"kubernetes.io/projected/cdebaba1-f91c-4094-a25c-e09177d676e1-kube-api-access-c7wkb\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.129426 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-log-ovn\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.129474 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run-ovn\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.129499 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.230694 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-scripts\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.230777 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-additional-scripts\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.230811 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-c7wkb\" (UniqueName: \"kubernetes.io/projected/cdebaba1-f91c-4094-a25c-e09177d676e1-kube-api-access-c7wkb\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.230892 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-log-ovn\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.230913 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run-ovn\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.230934 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.231284 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.231324 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-log-ovn\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.231325 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run-ovn\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.232208 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-additional-scripts\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.232711 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-scripts\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.252364 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7wkb\" (UniqueName: \"kubernetes.io/projected/cdebaba1-f91c-4094-a25c-e09177d676e1-kube-api-access-c7wkb\") pod \"ovn-controller-9vssn-config-rz99p\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:35 crc kubenswrapper[4911]: I0310 14:22:35.382760 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.027286 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7fwgv"] Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.028610 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7fwgv" Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.035107 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7fwgv"] Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.035344 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.096221 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btvq5\" (UniqueName: \"kubernetes.io/projected/83e91396-f5e0-46f1-99ce-394c03a93db3-kube-api-access-btvq5\") pod \"root-account-create-update-7fwgv\" (UID: \"83e91396-f5e0-46f1-99ce-394c03a93db3\") " pod="openstack/root-account-create-update-7fwgv" Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.096334 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83e91396-f5e0-46f1-99ce-394c03a93db3-operator-scripts\") pod \"root-account-create-update-7fwgv\" (UID: \"83e91396-f5e0-46f1-99ce-394c03a93db3\") " pod="openstack/root-account-create-update-7fwgv" Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.198388 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btvq5\" (UniqueName: \"kubernetes.io/projected/83e91396-f5e0-46f1-99ce-394c03a93db3-kube-api-access-btvq5\") pod \"root-account-create-update-7fwgv\" (UID: \"83e91396-f5e0-46f1-99ce-394c03a93db3\") " 
pod="openstack/root-account-create-update-7fwgv" Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.198468 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83e91396-f5e0-46f1-99ce-394c03a93db3-operator-scripts\") pod \"root-account-create-update-7fwgv\" (UID: \"83e91396-f5e0-46f1-99ce-394c03a93db3\") " pod="openstack/root-account-create-update-7fwgv" Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.199444 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83e91396-f5e0-46f1-99ce-394c03a93db3-operator-scripts\") pod \"root-account-create-update-7fwgv\" (UID: \"83e91396-f5e0-46f1-99ce-394c03a93db3\") " pod="openstack/root-account-create-update-7fwgv" Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.223349 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btvq5\" (UniqueName: \"kubernetes.io/projected/83e91396-f5e0-46f1-99ce-394c03a93db3-kube-api-access-btvq5\") pod \"root-account-create-update-7fwgv\" (UID: \"83e91396-f5e0-46f1-99ce-394c03a93db3\") " pod="openstack/root-account-create-update-7fwgv" Mar 10 14:22:38 crc kubenswrapper[4911]: I0310 14:22:38.354563 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7fwgv" Mar 10 14:22:39 crc kubenswrapper[4911]: I0310 14:22:39.690985 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9vssn" podUID="e43fdd12-0361-428c-8318-d1cec1c95399" containerName="ovn-controller" probeResult="failure" output=< Mar 10 14:22:39 crc kubenswrapper[4911]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 14:22:39 crc kubenswrapper[4911]: > Mar 10 14:22:39 crc kubenswrapper[4911]: I0310 14:22:39.831430 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:39 crc kubenswrapper[4911]: I0310 14:22:39.837870 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ca33eab-721d-4858-8e23-9ffc6371926f-etc-swift\") pod \"swift-storage-0\" (UID: \"2ca33eab-721d-4858-8e23-9ffc6371926f\") " pod="openstack/swift-storage-0" Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.006354 4911 scope.go:117] "RemoveContainer" containerID="11a15641096229e69f468a09c68f7a8b99e011a182675b73fe8fecf1cbfaf421" Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.008489 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9vssn-config-rz99p"] Mar 10 14:22:40 crc kubenswrapper[4911]: W0310 14:22:40.014587 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdebaba1_f91c_4094_a25c_e09177d676e1.slice/crio-e778250a5d6198831d53186021c0dac2fb6f1581acee209768b1ff6955d8f1d7 WatchSource:0}: Error finding container e778250a5d6198831d53186021c0dac2fb6f1581acee209768b1ff6955d8f1d7: Status 404 returned 
error can't find the container with id e778250a5d6198831d53186021c0dac2fb6f1581acee209768b1ff6955d8f1d7 Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.076387 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.099431 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7fwgv"] Mar 10 14:22:40 crc kubenswrapper[4911]: W0310 14:22:40.112794 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83e91396_f5e0_46f1_99ce_394c03a93db3.slice/crio-cad49a999c4ac42fe503cad05554d7fc992f9eceb4a85df89c951d38eea17e8d WatchSource:0}: Error finding container cad49a999c4ac42fe503cad05554d7fc992f9eceb4a85df89c951d38eea17e8d: Status 404 returned error can't find the container with id cad49a999c4ac42fe503cad05554d7fc992f9eceb4a85df89c951d38eea17e8d Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.442451 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.600552 4911 generic.go:334] "Generic (PLEG): container finished" podID="83e91396-f5e0-46f1-99ce-394c03a93db3" containerID="4271682c8fca9a85282ff7eeb6a7eae6f6496bed656befdc203c85d336bfc4a8" exitCode=0 Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.600611 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7fwgv" event={"ID":"83e91396-f5e0-46f1-99ce-394c03a93db3","Type":"ContainerDied","Data":"4271682c8fca9a85282ff7eeb6a7eae6f6496bed656befdc203c85d336bfc4a8"} Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.600640 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7fwgv" 
event={"ID":"83e91396-f5e0-46f1-99ce-394c03a93db3","Type":"ContainerStarted","Data":"cad49a999c4ac42fe503cad05554d7fc992f9eceb4a85df89c951d38eea17e8d"} Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.602628 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6v9wc" event={"ID":"a49be6ea-3e0f-4975-89ff-9cfd022e4980","Type":"ContainerStarted","Data":"e36118380955427affb8dc83b7d5aac3c92cfe0618dd0fd5bc1e0373caf12157"} Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.606739 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"83592559194f1e91313be2ea755d3070343117d24076e3ab2fcf45eca5c45b1c"} Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.608600 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9vssn-config-rz99p" event={"ID":"cdebaba1-f91c-4094-a25c-e09177d676e1","Type":"ContainerStarted","Data":"75a1fa23f4e3cfb65b887bc4bbe3566742e62d45a418da5164eb20a5fde2529c"} Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.608626 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9vssn-config-rz99p" event={"ID":"cdebaba1-f91c-4094-a25c-e09177d676e1","Type":"ContainerStarted","Data":"e778250a5d6198831d53186021c0dac2fb6f1581acee209768b1ff6955d8f1d7"} Mar 10 14:22:40 crc kubenswrapper[4911]: I0310 14:22:40.648563 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9vssn-config-rz99p" podStartSLOduration=5.648537873 podStartE2EDuration="5.648537873s" podCreationTimestamp="2026-03-10 14:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:40.6420172 +0000 UTC m=+1265.205537117" watchObservedRunningTime="2026-03-10 14:22:40.648537873 +0000 UTC m=+1265.212057790" Mar 10 14:22:40 crc 
kubenswrapper[4911]: I0310 14:22:40.669605 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-6v9wc" podStartSLOduration=2.744324533 podStartE2EDuration="16.669581598s" podCreationTimestamp="2026-03-10 14:22:24 +0000 UTC" firstStartedPulling="2026-03-10 14:22:25.774484924 +0000 UTC m=+1250.338004841" lastFinishedPulling="2026-03-10 14:22:39.699741989 +0000 UTC m=+1264.263261906" observedRunningTime="2026-03-10 14:22:40.66484741 +0000 UTC m=+1265.228367327" watchObservedRunningTime="2026-03-10 14:22:40.669581598 +0000 UTC m=+1265.233101515" Mar 10 14:22:41 crc kubenswrapper[4911]: I0310 14:22:41.622464 4911 generic.go:334] "Generic (PLEG): container finished" podID="cdebaba1-f91c-4094-a25c-e09177d676e1" containerID="75a1fa23f4e3cfb65b887bc4bbe3566742e62d45a418da5164eb20a5fde2529c" exitCode=0 Mar 10 14:22:41 crc kubenswrapper[4911]: I0310 14:22:41.622563 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9vssn-config-rz99p" event={"ID":"cdebaba1-f91c-4094-a25c-e09177d676e1","Type":"ContainerDied","Data":"75a1fa23f4e3cfb65b887bc4bbe3566742e62d45a418da5164eb20a5fde2529c"} Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.108063 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7fwgv" Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.170267 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btvq5\" (UniqueName: \"kubernetes.io/projected/83e91396-f5e0-46f1-99ce-394c03a93db3-kube-api-access-btvq5\") pod \"83e91396-f5e0-46f1-99ce-394c03a93db3\" (UID: \"83e91396-f5e0-46f1-99ce-394c03a93db3\") " Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.170534 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83e91396-f5e0-46f1-99ce-394c03a93db3-operator-scripts\") pod \"83e91396-f5e0-46f1-99ce-394c03a93db3\" (UID: \"83e91396-f5e0-46f1-99ce-394c03a93db3\") " Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.171508 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e91396-f5e0-46f1-99ce-394c03a93db3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83e91396-f5e0-46f1-99ce-394c03a93db3" (UID: "83e91396-f5e0-46f1-99ce-394c03a93db3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.175294 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e91396-f5e0-46f1-99ce-394c03a93db3-kube-api-access-btvq5" (OuterVolumeSpecName: "kube-api-access-btvq5") pod "83e91396-f5e0-46f1-99ce-394c03a93db3" (UID: "83e91396-f5e0-46f1-99ce-394c03a93db3"). InnerVolumeSpecName "kube-api-access-btvq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.272786 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83e91396-f5e0-46f1-99ce-394c03a93db3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.272814 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btvq5\" (UniqueName: \"kubernetes.io/projected/83e91396-f5e0-46f1-99ce-394c03a93db3-kube-api-access-btvq5\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.633547 4911 generic.go:334] "Generic (PLEG): container finished" podID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" containerID="6a0ccf7ab32bbdd3db16a7cb4519b544bcfd41ff2a2f3f99c3689e09324d90f1" exitCode=0 Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.633618 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745","Type":"ContainerDied","Data":"6a0ccf7ab32bbdd3db16a7cb4519b544bcfd41ff2a2f3f99c3689e09324d90f1"} Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.636615 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d1a8c73-283d-431f-bfd3-af06ca3c60ff","Type":"ContainerDied","Data":"59082a05080778cdaeafa7755b2401054688c324364ece5a1d2c398ee3e47500"} Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.636569 4911 generic.go:334] "Generic (PLEG): container finished" podID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" containerID="59082a05080778cdaeafa7755b2401054688c324364ece5a1d2c398ee3e47500" exitCode=0 Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.651655 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"020765f0f19da4fd3aed5e9dcca5f2298c53aaeead8bd9a947f8c1f2bffc1a68"} Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.651763 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"8d5f20ab107a1017e2392b1b80327458146bb9d5b46c943e9c35f94027e07afa"} Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.651784 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"881240a8f85d898237b916e70d7821eb891933c26b3bce3632423a85007c1543"} Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.651798 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"075e0aae7b1db6a24da9ac7c057c0102ee581a569073d54e9e0ebc215a1ad790"} Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.654826 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7fwgv" Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.655454 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7fwgv" event={"ID":"83e91396-f5e0-46f1-99ce-394c03a93db3","Type":"ContainerDied","Data":"cad49a999c4ac42fe503cad05554d7fc992f9eceb4a85df89c951d38eea17e8d"} Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.655526 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cad49a999c4ac42fe503cad05554d7fc992f9eceb4a85df89c951d38eea17e8d" Mar 10 14:22:42 crc kubenswrapper[4911]: I0310 14:22:42.937589 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.087875 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7wkb\" (UniqueName: \"kubernetes.io/projected/cdebaba1-f91c-4094-a25c-e09177d676e1-kube-api-access-c7wkb\") pod \"cdebaba1-f91c-4094-a25c-e09177d676e1\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.088319 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-scripts\") pod \"cdebaba1-f91c-4094-a25c-e09177d676e1\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.088460 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run-ovn\") pod \"cdebaba1-f91c-4094-a25c-e09177d676e1\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.088582 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-log-ovn\") pod \"cdebaba1-f91c-4094-a25c-e09177d676e1\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.088613 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "cdebaba1-f91c-4094-a25c-e09177d676e1" (UID: "cdebaba1-f91c-4094-a25c-e09177d676e1"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.088682 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "cdebaba1-f91c-4094-a25c-e09177d676e1" (UID: "cdebaba1-f91c-4094-a25c-e09177d676e1"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.088853 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-additional-scripts\") pod \"cdebaba1-f91c-4094-a25c-e09177d676e1\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.088944 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run\") pod \"cdebaba1-f91c-4094-a25c-e09177d676e1\" (UID: \"cdebaba1-f91c-4094-a25c-e09177d676e1\") " Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.089392 4911 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.089462 4911 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.088970 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run" (OuterVolumeSpecName: "var-run") pod 
"cdebaba1-f91c-4094-a25c-e09177d676e1" (UID: "cdebaba1-f91c-4094-a25c-e09177d676e1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.089332 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-scripts" (OuterVolumeSpecName: "scripts") pod "cdebaba1-f91c-4094-a25c-e09177d676e1" (UID: "cdebaba1-f91c-4094-a25c-e09177d676e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.089359 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "cdebaba1-f91c-4094-a25c-e09177d676e1" (UID: "cdebaba1-f91c-4094-a25c-e09177d676e1"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.093578 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdebaba1-f91c-4094-a25c-e09177d676e1-kube-api-access-c7wkb" (OuterVolumeSpecName: "kube-api-access-c7wkb") pod "cdebaba1-f91c-4094-a25c-e09177d676e1" (UID: "cdebaba1-f91c-4094-a25c-e09177d676e1"). InnerVolumeSpecName "kube-api-access-c7wkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.126103 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9vssn-config-rz99p"] Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.136916 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9vssn-config-rz99p"] Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.201364 4911 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cdebaba1-f91c-4094-a25c-e09177d676e1-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.201604 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7wkb\" (UniqueName: \"kubernetes.io/projected/cdebaba1-f91c-4094-a25c-e09177d676e1-kube-api-access-c7wkb\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.201699 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.201717 4911 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cdebaba1-f91c-4094-a25c-e09177d676e1-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.216120 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9vssn-config-77lxm"] Mar 10 14:22:43 crc kubenswrapper[4911]: E0310 14:22:43.216804 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdebaba1-f91c-4094-a25c-e09177d676e1" containerName="ovn-config" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.216882 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdebaba1-f91c-4094-a25c-e09177d676e1" 
containerName="ovn-config" Mar 10 14:22:43 crc kubenswrapper[4911]: E0310 14:22:43.216964 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e91396-f5e0-46f1-99ce-394c03a93db3" containerName="mariadb-account-create-update" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.217016 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e91396-f5e0-46f1-99ce-394c03a93db3" containerName="mariadb-account-create-update" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.217214 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdebaba1-f91c-4094-a25c-e09177d676e1" containerName="ovn-config" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.217295 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e91396-f5e0-46f1-99ce-394c03a93db3" containerName="mariadb-account-create-update" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.217985 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.242257 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9vssn-config-77lxm"] Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.303771 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-scripts\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.303836 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-log-ovn\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " 
pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.303873 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.303893 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run-ovn\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.303972 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvrbq\" (UniqueName: \"kubernetes.io/projected/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-kube-api-access-vvrbq\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.304055 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-additional-scripts\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.405697 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-additional-scripts\") pod 
\"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.405785 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-scripts\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.405820 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-log-ovn\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.405856 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.405883 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run-ovn\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.405948 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvrbq\" (UniqueName: \"kubernetes.io/projected/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-kube-api-access-vvrbq\") pod 
\"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.406347 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-log-ovn\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.406347 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.406977 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-additional-scripts\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.408299 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-scripts\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.408430 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run-ovn\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: 
\"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.424907 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvrbq\" (UniqueName: \"kubernetes.io/projected/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-kube-api-access-vvrbq\") pod \"ovn-controller-9vssn-config-77lxm\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.536451 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.677658 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e778250a5d6198831d53186021c0dac2fb6f1581acee209768b1ff6955d8f1d7" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.678085 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9vssn-config-rz99p" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.696660 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745","Type":"ContainerStarted","Data":"debb01c86ed7af6a4695794fce229ff3866fbbfc081c0a593c3677405f2a6730"} Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.697485 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.710967 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d1a8c73-283d-431f-bfd3-af06ca3c60ff","Type":"ContainerStarted","Data":"eee414a09d2e6bbf0705352ff16ac4aa998f7c1ace07d1bc528a4da897d60418"} Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.712016 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.765261 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=44.772569248 podStartE2EDuration="1m13.76523696s" podCreationTimestamp="2026-03-10 14:21:30 +0000 UTC" firstStartedPulling="2026-03-10 14:21:38.81369097 +0000 UTC m=+1203.377210907" lastFinishedPulling="2026-03-10 14:22:07.806358702 +0000 UTC m=+1232.369878619" observedRunningTime="2026-03-10 14:22:43.727803726 +0000 UTC m=+1268.291323653" watchObservedRunningTime="2026-03-10 14:22:43.76523696 +0000 UTC m=+1268.328756877" Mar 10 14:22:43 crc kubenswrapper[4911]: I0310 14:22:43.768789 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371963.086063 podStartE2EDuration="1m13.768711917s" podCreationTimestamp="2026-03-10 14:21:30 +0000 UTC" firstStartedPulling="2026-03-10 14:21:32.440616988 
+0000 UTC m=+1197.004136895" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:43.763774284 +0000 UTC m=+1268.327294211" watchObservedRunningTime="2026-03-10 14:22:43.768711917 +0000 UTC m=+1268.332231834" Mar 10 14:22:44 crc kubenswrapper[4911]: I0310 14:22:44.119361 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9vssn-config-77lxm"] Mar 10 14:22:44 crc kubenswrapper[4911]: I0310 14:22:44.213572 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdebaba1-f91c-4094-a25c-e09177d676e1" path="/var/lib/kubelet/pods/cdebaba1-f91c-4094-a25c-e09177d676e1/volumes" Mar 10 14:22:44 crc kubenswrapper[4911]: I0310 14:22:44.707502 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9vssn" Mar 10 14:22:44 crc kubenswrapper[4911]: I0310 14:22:44.749197 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"698e0af0d2f5f1ba4c7f0b7421891073c8d460028488886b786ec27e92f7fbce"} Mar 10 14:22:44 crc kubenswrapper[4911]: I0310 14:22:44.749268 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"49d0e7c3913b1276b0c2c652d8ca38b2a3a04760dcb9f5e2c155f1cf45930d68"} Mar 10 14:22:44 crc kubenswrapper[4911]: I0310 14:22:44.749288 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"f4d7283ddd36a9b9002333bbac8d41f37d7bee3bca3d0d666d9045b2114c7901"} Mar 10 14:22:44 crc kubenswrapper[4911]: I0310 14:22:44.760364 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9vssn-config-77lxm" 
event={"ID":"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17","Type":"ContainerStarted","Data":"fddef34e1538307a3db7a9a01521f575c59c430239c11ffc64af7724dfe4fbca"} Mar 10 14:22:44 crc kubenswrapper[4911]: I0310 14:22:44.760443 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9vssn-config-77lxm" event={"ID":"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17","Type":"ContainerStarted","Data":"36d20b2de850d1a7655b77effb21b00adcb173a786dede9d7bb52638f8a2e26d"} Mar 10 14:22:44 crc kubenswrapper[4911]: I0310 14:22:44.782924 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9vssn-config-77lxm" podStartSLOduration=1.782891883 podStartE2EDuration="1.782891883s" podCreationTimestamp="2026-03-10 14:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:44.781776525 +0000 UTC m=+1269.345296462" watchObservedRunningTime="2026-03-10 14:22:44.782891883 +0000 UTC m=+1269.346411800" Mar 10 14:22:45 crc kubenswrapper[4911]: I0310 14:22:45.771636 4911 generic.go:334] "Generic (PLEG): container finished" podID="a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" containerID="fddef34e1538307a3db7a9a01521f575c59c430239c11ffc64af7724dfe4fbca" exitCode=0 Mar 10 14:22:45 crc kubenswrapper[4911]: I0310 14:22:45.771744 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9vssn-config-77lxm" event={"ID":"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17","Type":"ContainerDied","Data":"fddef34e1538307a3db7a9a01521f575c59c430239c11ffc64af7724dfe4fbca"} Mar 10 14:22:45 crc kubenswrapper[4911]: I0310 14:22:45.781357 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"94f04e9ae913a5680e0f4f7a859b18c7289ec1bbcf2ddeffd966cf1c81d026ba"} Mar 10 14:22:46 crc kubenswrapper[4911]: I0310 14:22:46.794717 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"6e582f21340b1db58e8f2615bc29df6365de3c3204d3ad3634552aabffbc3761"} Mar 10 14:22:46 crc kubenswrapper[4911]: I0310 14:22:46.795123 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"847e6dc0820f2ed783b15168003227049f45ace88391e849ed7e5d23058ae2e9"} Mar 10 14:22:46 crc kubenswrapper[4911]: I0310 14:22:46.795140 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"2cf6b33e37b1da282cbbfc77b1a1e1a4298fe48e56d02c1cc38529c25a6938b4"} Mar 10 14:22:46 crc kubenswrapper[4911]: I0310 14:22:46.795152 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"e5a5010b087de7a100baec4b85a5b96573f0bdfd8a56d5a220ddec582f47b9ab"} Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.195134 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.285692 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-log-ovn\") pod \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.285832 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" (UID: "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.285868 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-scripts\") pod \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.285999 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvrbq\" (UniqueName: \"kubernetes.io/projected/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-kube-api-access-vvrbq\") pod \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.286038 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run-ovn\") pod \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.286066 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run\") pod \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.286122 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" (UID: "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.286187 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-additional-scripts\") pod \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\" (UID: \"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17\") " Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.286234 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run" (OuterVolumeSpecName: "var-run") pod "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" (UID: "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.286857 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" (UID: "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.286885 4911 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.286908 4911 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.286922 4911 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.287096 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-scripts" (OuterVolumeSpecName: "scripts") pod "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" (UID: "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.290393 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-kube-api-access-vvrbq" (OuterVolumeSpecName: "kube-api-access-vvrbq") pod "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" (UID: "a6a28e78-c091-47c3-bbb4-7c8cc0e23c17"). InnerVolumeSpecName "kube-api-access-vvrbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.388762 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.388806 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvrbq\" (UniqueName: \"kubernetes.io/projected/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-kube-api-access-vvrbq\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.388820 4911 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.805090 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9vssn-config-77lxm" event={"ID":"a6a28e78-c091-47c3-bbb4-7c8cc0e23c17","Type":"ContainerDied","Data":"36d20b2de850d1a7655b77effb21b00adcb173a786dede9d7bb52638f8a2e26d"} Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.805473 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36d20b2de850d1a7655b77effb21b00adcb173a786dede9d7bb52638f8a2e26d" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.805132 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9vssn-config-77lxm" Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.812344 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"0d6c46b6ceaf71236cec025064530bf0b61c30775aee5aa7b9ff013e6e3336c3"} Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.812397 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"8e8d6808fdd9b07544fdd1677cf7720912d143aeb24dc00b32535ee363af430c"} Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.812409 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ca33eab-721d-4858-8e23-9ffc6371926f","Type":"ContainerStarted","Data":"53ad504da8648d9c3b215d6af92c4c8dcf7c14c975a1ab25739596b800938da4"} Mar 10 14:22:47 crc kubenswrapper[4911]: I0310 14:22:47.883966 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.43599874 podStartE2EDuration="41.883928179s" podCreationTimestamp="2026-03-10 14:22:06 +0000 UTC" firstStartedPulling="2026-03-10 14:22:40.451684289 +0000 UTC m=+1265.015204206" lastFinishedPulling="2026-03-10 14:22:45.899613728 +0000 UTC m=+1270.463133645" observedRunningTime="2026-03-10 14:22:47.874711889 +0000 UTC m=+1272.438231816" watchObservedRunningTime="2026-03-10 14:22:47.883928179 +0000 UTC m=+1272.447448096" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.205128 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4rfzv"] Mar 10 14:22:48 crc kubenswrapper[4911]: E0310 14:22:48.205562 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" containerName="ovn-config" Mar 10 14:22:48 crc kubenswrapper[4911]: 
I0310 14:22:48.205581 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" containerName="ovn-config" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.207351 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" containerName="ovn-config" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.208648 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.211077 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.230601 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4rfzv"] Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.296697 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9vssn-config-77lxm"] Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.303567 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9vssn-config-77lxm"] Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.303934 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-config\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.304043 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " 
pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.304130 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-svc\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.304159 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.304184 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxdt5\" (UniqueName: \"kubernetes.io/projected/e6cf2588-011c-44f8-a24c-984eb3b74809-kube-api-access-bxdt5\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.304248 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.406459 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: 
\"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.406612 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-config\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.406678 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.406747 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-svc\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.406773 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.406803 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxdt5\" (UniqueName: \"kubernetes.io/projected/e6cf2588-011c-44f8-a24c-984eb3b74809-kube-api-access-bxdt5\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " 
pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.407533 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.407999 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-svc\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.407999 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.408141 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-config\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.408214 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.425413 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxdt5\" (UniqueName: \"kubernetes.io/projected/e6cf2588-011c-44f8-a24c-984eb3b74809-kube-api-access-bxdt5\") pod \"dnsmasq-dns-764c5664d7-4rfzv\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.528043 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.838286 4911 generic.go:334] "Generic (PLEG): container finished" podID="a49be6ea-3e0f-4975-89ff-9cfd022e4980" containerID="e36118380955427affb8dc83b7d5aac3c92cfe0618dd0fd5bc1e0373caf12157" exitCode=0 Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.838445 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6v9wc" event={"ID":"a49be6ea-3e0f-4975-89ff-9cfd022e4980","Type":"ContainerDied","Data":"e36118380955427affb8dc83b7d5aac3c92cfe0618dd0fd5bc1e0373caf12157"} Mar 10 14:22:48 crc kubenswrapper[4911]: I0310 14:22:48.991058 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4rfzv"] Mar 10 14:22:49 crc kubenswrapper[4911]: W0310 14:22:49.003204 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6cf2588_011c_44f8_a24c_984eb3b74809.slice/crio-ec40226046195e6afee2c2d868ab4b723d8218ba4d1dd60447474c7766210374 WatchSource:0}: Error finding container ec40226046195e6afee2c2d868ab4b723d8218ba4d1dd60447474c7766210374: Status 404 returned error can't find the container with id ec40226046195e6afee2c2d868ab4b723d8218ba4d1dd60447474c7766210374 Mar 10 14:22:49 crc kubenswrapper[4911]: I0310 14:22:49.848444 4911 generic.go:334] "Generic (PLEG): container finished" podID="e6cf2588-011c-44f8-a24c-984eb3b74809" 
containerID="5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6" exitCode=0 Mar 10 14:22:49 crc kubenswrapper[4911]: I0310 14:22:49.849560 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" event={"ID":"e6cf2588-011c-44f8-a24c-984eb3b74809","Type":"ContainerDied","Data":"5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6"} Mar 10 14:22:49 crc kubenswrapper[4911]: I0310 14:22:49.849768 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" event={"ID":"e6cf2588-011c-44f8-a24c-984eb3b74809","Type":"ContainerStarted","Data":"ec40226046195e6afee2c2d868ab4b723d8218ba4d1dd60447474c7766210374"} Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.205034 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a28e78-c091-47c3-bbb4-7c8cc0e23c17" path="/var/lib/kubelet/pods/a6a28e78-c091-47c3-bbb4-7c8cc0e23c17/volumes" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.278332 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.345029 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krwq8\" (UniqueName: \"kubernetes.io/projected/a49be6ea-3e0f-4975-89ff-9cfd022e4980-kube-api-access-krwq8\") pod \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.345239 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-config-data\") pod \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.345297 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-db-sync-config-data\") pod \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.345357 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-combined-ca-bundle\") pod \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\" (UID: \"a49be6ea-3e0f-4975-89ff-9cfd022e4980\") " Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.351596 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a49be6ea-3e0f-4975-89ff-9cfd022e4980" (UID: "a49be6ea-3e0f-4975-89ff-9cfd022e4980"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.363688 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a49be6ea-3e0f-4975-89ff-9cfd022e4980-kube-api-access-krwq8" (OuterVolumeSpecName: "kube-api-access-krwq8") pod "a49be6ea-3e0f-4975-89ff-9cfd022e4980" (UID: "a49be6ea-3e0f-4975-89ff-9cfd022e4980"). InnerVolumeSpecName "kube-api-access-krwq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.385164 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a49be6ea-3e0f-4975-89ff-9cfd022e4980" (UID: "a49be6ea-3e0f-4975-89ff-9cfd022e4980"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.386648 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-config-data" (OuterVolumeSpecName: "config-data") pod "a49be6ea-3e0f-4975-89ff-9cfd022e4980" (UID: "a49be6ea-3e0f-4975-89ff-9cfd022e4980"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.447178 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.447446 4911 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.447524 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49be6ea-3e0f-4975-89ff-9cfd022e4980-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.447593 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krwq8\" (UniqueName: \"kubernetes.io/projected/a49be6ea-3e0f-4975-89ff-9cfd022e4980-kube-api-access-krwq8\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.858775 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" event={"ID":"e6cf2588-011c-44f8-a24c-984eb3b74809","Type":"ContainerStarted","Data":"454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc"} Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.860143 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6v9wc" event={"ID":"a49be6ea-3e0f-4975-89ff-9cfd022e4980","Type":"ContainerDied","Data":"8977a38cbaaf32eed51d32da6ec27ec80d42ffebe5949c913ded16c3750bfca1"} Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.860177 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8977a38cbaaf32eed51d32da6ec27ec80d42ffebe5949c913ded16c3750bfca1" Mar 10 14:22:50 crc 
kubenswrapper[4911]: I0310 14:22:50.860385 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6v9wc" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.860412 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:50 crc kubenswrapper[4911]: I0310 14:22:50.900710 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" podStartSLOduration=2.900687202 podStartE2EDuration="2.900687202s" podCreationTimestamp="2026-03-10 14:22:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:50.896378984 +0000 UTC m=+1275.459898901" watchObservedRunningTime="2026-03-10 14:22:50.900687202 +0000 UTC m=+1275.464207119" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.320289 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4rfzv"] Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.359972 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kbjd8"] Mar 10 14:22:51 crc kubenswrapper[4911]: E0310 14:22:51.360388 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49be6ea-3e0f-4975-89ff-9cfd022e4980" containerName="glance-db-sync" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.360409 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49be6ea-3e0f-4975-89ff-9cfd022e4980" containerName="glance-db-sync" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.360626 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49be6ea-3e0f-4975-89ff-9cfd022e4980" containerName="glance-db-sync" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.361583 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.380217 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kbjd8"] Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.463250 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.463347 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.463453 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-config\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.463496 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2br5g\" (UniqueName: \"kubernetes.io/projected/244b08cc-4c8e-453c-a04c-19972742b5ca-kube-api-access-2br5g\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.463521 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.463543 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.565463 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.565634 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.565700 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.565751 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-config\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.565797 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2br5g\" (UniqueName: \"kubernetes.io/projected/244b08cc-4c8e-453c-a04c-19972742b5ca-kube-api-access-2br5g\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.565824 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.566804 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.566959 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-config\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.566967 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.567021 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.567214 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.589002 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2br5g\" (UniqueName: \"kubernetes.io/projected/244b08cc-4c8e-453c-a04c-19972742b5ca-kube-api-access-2br5g\") pod \"dnsmasq-dns-74f6bcbc87-kbjd8\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:51 crc kubenswrapper[4911]: I0310 14:22:51.680423 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:52 crc kubenswrapper[4911]: I0310 14:22:52.110940 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kbjd8"] Mar 10 14:22:52 crc kubenswrapper[4911]: W0310 14:22:52.111565 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod244b08cc_4c8e_453c_a04c_19972742b5ca.slice/crio-b9c34c15df9f6b64bdab10edddc533c76e7bc03dc5725ab96f7f5ac248705368 WatchSource:0}: Error finding container b9c34c15df9f6b64bdab10edddc533c76e7bc03dc5725ab96f7f5ac248705368: Status 404 returned error can't find the container with id b9c34c15df9f6b64bdab10edddc533c76e7bc03dc5725ab96f7f5ac248705368 Mar 10 14:22:52 crc kubenswrapper[4911]: I0310 14:22:52.141076 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 10 14:22:52 crc kubenswrapper[4911]: I0310 14:22:52.882933 4911 generic.go:334] "Generic (PLEG): container finished" podID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerID="bfd1dce16419efeb13bbeaf36197ce64a637857c033483352cb3414a48fe8782" exitCode=0 Mar 10 14:22:52 crc kubenswrapper[4911]: I0310 14:22:52.883067 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" event={"ID":"244b08cc-4c8e-453c-a04c-19972742b5ca","Type":"ContainerDied","Data":"bfd1dce16419efeb13bbeaf36197ce64a637857c033483352cb3414a48fe8782"} Mar 10 14:22:52 crc kubenswrapper[4911]: I0310 14:22:52.883284 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" event={"ID":"244b08cc-4c8e-453c-a04c-19972742b5ca","Type":"ContainerStarted","Data":"b9c34c15df9f6b64bdab10edddc533c76e7bc03dc5725ab96f7f5ac248705368"} Mar 10 14:22:52 crc 
kubenswrapper[4911]: I0310 14:22:52.883449 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" podUID="e6cf2588-011c-44f8-a24c-984eb3b74809" containerName="dnsmasq-dns" containerID="cri-o://454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc" gracePeriod=10 Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.393016 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.501021 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-config\") pod \"e6cf2588-011c-44f8-a24c-984eb3b74809\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.501091 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-svc\") pod \"e6cf2588-011c-44f8-a24c-984eb3b74809\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.501110 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-sb\") pod \"e6cf2588-011c-44f8-a24c-984eb3b74809\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.501173 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxdt5\" (UniqueName: \"kubernetes.io/projected/e6cf2588-011c-44f8-a24c-984eb3b74809-kube-api-access-bxdt5\") pod \"e6cf2588-011c-44f8-a24c-984eb3b74809\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.501256 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-swift-storage-0\") pod \"e6cf2588-011c-44f8-a24c-984eb3b74809\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.501354 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-nb\") pod \"e6cf2588-011c-44f8-a24c-984eb3b74809\" (UID: \"e6cf2588-011c-44f8-a24c-984eb3b74809\") " Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.514136 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cf2588-011c-44f8-a24c-984eb3b74809-kube-api-access-bxdt5" (OuterVolumeSpecName: "kube-api-access-bxdt5") pod "e6cf2588-011c-44f8-a24c-984eb3b74809" (UID: "e6cf2588-011c-44f8-a24c-984eb3b74809"). InnerVolumeSpecName "kube-api-access-bxdt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.551760 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e6cf2588-011c-44f8-a24c-984eb3b74809" (UID: "e6cf2588-011c-44f8-a24c-984eb3b74809"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.554987 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e6cf2588-011c-44f8-a24c-984eb3b74809" (UID: "e6cf2588-011c-44f8-a24c-984eb3b74809"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.556519 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6cf2588-011c-44f8-a24c-984eb3b74809" (UID: "e6cf2588-011c-44f8-a24c-984eb3b74809"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.558450 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e6cf2588-011c-44f8-a24c-984eb3b74809" (UID: "e6cf2588-011c-44f8-a24c-984eb3b74809"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.559229 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-config" (OuterVolumeSpecName: "config") pod "e6cf2588-011c-44f8-a24c-984eb3b74809" (UID: "e6cf2588-011c-44f8-a24c-984eb3b74809"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.603795 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.603827 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.603839 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxdt5\" (UniqueName: \"kubernetes.io/projected/e6cf2588-011c-44f8-a24c-984eb3b74809-kube-api-access-bxdt5\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.603854 4911 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.603864 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.603872 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6cf2588-011c-44f8-a24c-984eb3b74809-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.896288 4911 generic.go:334] "Generic (PLEG): container finished" podID="e6cf2588-011c-44f8-a24c-984eb3b74809" containerID="454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc" exitCode=0 Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.896379 4911 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" event={"ID":"e6cf2588-011c-44f8-a24c-984eb3b74809","Type":"ContainerDied","Data":"454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc"} Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.896697 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" event={"ID":"e6cf2588-011c-44f8-a24c-984eb3b74809","Type":"ContainerDied","Data":"ec40226046195e6afee2c2d868ab4b723d8218ba4d1dd60447474c7766210374"} Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.896408 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4rfzv" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.896743 4911 scope.go:117] "RemoveContainer" containerID="454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.903148 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" event={"ID":"244b08cc-4c8e-453c-a04c-19972742b5ca","Type":"ContainerStarted","Data":"f7582e25553c29861572627ca6db36fb8b1f3f33de8075082558cdb6b996be71"} Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.903433 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.929287 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" podStartSLOduration=2.929260519 podStartE2EDuration="2.929260519s" podCreationTimestamp="2026-03-10 14:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:22:53.920973792 +0000 UTC m=+1278.484493709" watchObservedRunningTime="2026-03-10 14:22:53.929260519 +0000 UTC m=+1278.492780436" Mar 10 14:22:53 crc 
kubenswrapper[4911]: I0310 14:22:53.935320 4911 scope.go:117] "RemoveContainer" containerID="5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.940828 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4rfzv"] Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.947578 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4rfzv"] Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.952280 4911 scope.go:117] "RemoveContainer" containerID="454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc" Mar 10 14:22:53 crc kubenswrapper[4911]: E0310 14:22:53.952807 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc\": container with ID starting with 454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc not found: ID does not exist" containerID="454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.952954 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc"} err="failed to get container status \"454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc\": rpc error: code = NotFound desc = could not find container \"454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc\": container with ID starting with 454b8316621e4a8348f2e3ac641028fdaabaaa43dfa8c521b9c4ec31d717fcdc not found: ID does not exist" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.953058 4911 scope.go:117] "RemoveContainer" containerID="5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6" Mar 10 14:22:53 crc kubenswrapper[4911]: E0310 14:22:53.953734 4911 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6\": container with ID starting with 5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6 not found: ID does not exist" containerID="5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6" Mar 10 14:22:53 crc kubenswrapper[4911]: I0310 14:22:53.953772 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6"} err="failed to get container status \"5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6\": rpc error: code = NotFound desc = could not find container \"5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6\": container with ID starting with 5ac13a52ffc8cb391418f6ab859c18a9da5f0d495f9cc6a6eba0dd5ce95889e6 not found: ID does not exist" Mar 10 14:22:54 crc kubenswrapper[4911]: I0310 14:22:54.203589 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6cf2588-011c-44f8-a24c-984eb3b74809" path="/var/lib/kubelet/pods/e6cf2588-011c-44f8-a24c-984eb3b74809/volumes" Mar 10 14:23:01 crc kubenswrapper[4911]: I0310 14:23:01.681869 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:23:01 crc kubenswrapper[4911]: I0310 14:23:01.765017 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:23:01 crc kubenswrapper[4911]: I0310 14:23:01.794355 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5xq5d"] Mar 10 14:23:01 crc kubenswrapper[4911]: I0310 14:23:01.794698 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-5xq5d" podUID="5b96432c-00a6-4109-ad5a-5e81eedb4611" containerName="dnsmasq-dns" 
containerID="cri-o://088ce3e316098d69cf9687cae329d7b74b1d202795beb964670796cdb096419f" gracePeriod=10 Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.007682 4911 generic.go:334] "Generic (PLEG): container finished" podID="5b96432c-00a6-4109-ad5a-5e81eedb4611" containerID="088ce3e316098d69cf9687cae329d7b74b1d202795beb964670796cdb096419f" exitCode=0 Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.007758 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5xq5d" event={"ID":"5b96432c-00a6-4109-ad5a-5e81eedb4611","Type":"ContainerDied","Data":"088ce3e316098d69cf9687cae329d7b74b1d202795beb964670796cdb096419f"} Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.144129 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.305521 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.392328 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95t9z\" (UniqueName: \"kubernetes.io/projected/5b96432c-00a6-4109-ad5a-5e81eedb4611-kube-api-access-95t9z\") pod \"5b96432c-00a6-4109-ad5a-5e81eedb4611\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.392381 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-dns-svc\") pod \"5b96432c-00a6-4109-ad5a-5e81eedb4611\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.392425 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-config\") pod 
\"5b96432c-00a6-4109-ad5a-5e81eedb4611\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.392457 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-nb\") pod \"5b96432c-00a6-4109-ad5a-5e81eedb4611\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.392594 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-sb\") pod \"5b96432c-00a6-4109-ad5a-5e81eedb4611\" (UID: \"5b96432c-00a6-4109-ad5a-5e81eedb4611\") " Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.398583 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b96432c-00a6-4109-ad5a-5e81eedb4611-kube-api-access-95t9z" (OuterVolumeSpecName: "kube-api-access-95t9z") pod "5b96432c-00a6-4109-ad5a-5e81eedb4611" (UID: "5b96432c-00a6-4109-ad5a-5e81eedb4611"). InnerVolumeSpecName "kube-api-access-95t9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.438666 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b96432c-00a6-4109-ad5a-5e81eedb4611" (UID: "5b96432c-00a6-4109-ad5a-5e81eedb4611"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.449977 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b96432c-00a6-4109-ad5a-5e81eedb4611" (UID: "5b96432c-00a6-4109-ad5a-5e81eedb4611"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.452164 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b96432c-00a6-4109-ad5a-5e81eedb4611" (UID: "5b96432c-00a6-4109-ad5a-5e81eedb4611"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.456020 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-config" (OuterVolumeSpecName: "config") pod "5b96432c-00a6-4109-ad5a-5e81eedb4611" (UID: "5b96432c-00a6-4109-ad5a-5e81eedb4611"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.495078 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.495122 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95t9z\" (UniqueName: \"kubernetes.io/projected/5b96432c-00a6-4109-ad5a-5e81eedb4611-kube-api-access-95t9z\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.495140 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.495153 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:02 crc kubenswrapper[4911]: I0310 14:23:02.495164 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b96432c-00a6-4109-ad5a-5e81eedb4611-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.016893 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5xq5d" event={"ID":"5b96432c-00a6-4109-ad5a-5e81eedb4611","Type":"ContainerDied","Data":"6320da9ff5e4181b78b3062ca71e9dcbc4cc06374c60a57fa28baa6e615942c1"} Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.017302 4911 scope.go:117] "RemoveContainer" containerID="088ce3e316098d69cf9687cae329d7b74b1d202795beb964670796cdb096419f" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.017507 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5xq5d" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.044865 4911 scope.go:117] "RemoveContainer" containerID="5ad281a7a8c2fb6cf9f522fb9d5879cceccde5d1bc94a6b2335256689af30165" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.059804 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5xq5d"] Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.076855 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5xq5d"] Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.862149 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-vsqwc"] Mar 10 14:23:03 crc kubenswrapper[4911]: E0310 14:23:03.862577 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cf2588-011c-44f8-a24c-984eb3b74809" containerName="dnsmasq-dns" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.862595 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cf2588-011c-44f8-a24c-984eb3b74809" containerName="dnsmasq-dns" Mar 10 14:23:03 crc kubenswrapper[4911]: E0310 14:23:03.862625 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b96432c-00a6-4109-ad5a-5e81eedb4611" containerName="init" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.862632 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b96432c-00a6-4109-ad5a-5e81eedb4611" containerName="init" Mar 10 14:23:03 crc kubenswrapper[4911]: E0310 14:23:03.862644 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cf2588-011c-44f8-a24c-984eb3b74809" containerName="init" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.862650 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cf2588-011c-44f8-a24c-984eb3b74809" containerName="init" Mar 10 14:23:03 crc kubenswrapper[4911]: E0310 14:23:03.862666 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5b96432c-00a6-4109-ad5a-5e81eedb4611" containerName="dnsmasq-dns" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.862672 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b96432c-00a6-4109-ad5a-5e81eedb4611" containerName="dnsmasq-dns" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.862865 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b96432c-00a6-4109-ad5a-5e81eedb4611" containerName="dnsmasq-dns" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.862888 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6cf2588-011c-44f8-a24c-984eb3b74809" containerName="dnsmasq-dns" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.863508 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vsqwc" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.887671 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vsqwc"] Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.919612 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jjc5\" (UniqueName: \"kubernetes.io/projected/be4f56e0-2bd6-482b-a147-9fccecd2aecc-kube-api-access-8jjc5\") pod \"cinder-db-create-vsqwc\" (UID: \"be4f56e0-2bd6-482b-a147-9fccecd2aecc\") " pod="openstack/cinder-db-create-vsqwc" Mar 10 14:23:03 crc kubenswrapper[4911]: I0310 14:23:03.919986 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4f56e0-2bd6-482b-a147-9fccecd2aecc-operator-scripts\") pod \"cinder-db-create-vsqwc\" (UID: \"be4f56e0-2bd6-482b-a147-9fccecd2aecc\") " pod="openstack/cinder-db-create-vsqwc" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.021394 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/be4f56e0-2bd6-482b-a147-9fccecd2aecc-operator-scripts\") pod \"cinder-db-create-vsqwc\" (UID: \"be4f56e0-2bd6-482b-a147-9fccecd2aecc\") " pod="openstack/cinder-db-create-vsqwc" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.021774 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jjc5\" (UniqueName: \"kubernetes.io/projected/be4f56e0-2bd6-482b-a147-9fccecd2aecc-kube-api-access-8jjc5\") pod \"cinder-db-create-vsqwc\" (UID: \"be4f56e0-2bd6-482b-a147-9fccecd2aecc\") " pod="openstack/cinder-db-create-vsqwc" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.022325 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4f56e0-2bd6-482b-a147-9fccecd2aecc-operator-scripts\") pod \"cinder-db-create-vsqwc\" (UID: \"be4f56e0-2bd6-482b-a147-9fccecd2aecc\") " pod="openstack/cinder-db-create-vsqwc" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.039956 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3ba5-account-create-update-vrg4k"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.041044 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3ba5-account-create-update-vrg4k" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.043358 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.066523 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3ba5-account-create-update-vrg4k"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.068777 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jjc5\" (UniqueName: \"kubernetes.io/projected/be4f56e0-2bd6-482b-a147-9fccecd2aecc-kube-api-access-8jjc5\") pod \"cinder-db-create-vsqwc\" (UID: \"be4f56e0-2bd6-482b-a147-9fccecd2aecc\") " pod="openstack/cinder-db-create-vsqwc" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.123214 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b5ds\" (UniqueName: \"kubernetes.io/projected/9e568bb4-835b-4478-a92e-c8a2bd91f48c-kube-api-access-5b5ds\") pod \"cinder-3ba5-account-create-update-vrg4k\" (UID: \"9e568bb4-835b-4478-a92e-c8a2bd91f48c\") " pod="openstack/cinder-3ba5-account-create-update-vrg4k" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.123363 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e568bb4-835b-4478-a92e-c8a2bd91f48c-operator-scripts\") pod \"cinder-3ba5-account-create-update-vrg4k\" (UID: \"9e568bb4-835b-4478-a92e-c8a2bd91f48c\") " pod="openstack/cinder-3ba5-account-create-update-vrg4k" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.174924 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-h9vhh"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.176144 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-h9vhh" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.189388 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h9vhh"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.194224 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vsqwc" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.225359 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b96432c-00a6-4109-ad5a-5e81eedb4611" path="/var/lib/kubelet/pods/5b96432c-00a6-4109-ad5a-5e81eedb4611/volumes" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.229789 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b5ds\" (UniqueName: \"kubernetes.io/projected/9e568bb4-835b-4478-a92e-c8a2bd91f48c-kube-api-access-5b5ds\") pod \"cinder-3ba5-account-create-update-vrg4k\" (UID: \"9e568bb4-835b-4478-a92e-c8a2bd91f48c\") " pod="openstack/cinder-3ba5-account-create-update-vrg4k" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.229888 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e568bb4-835b-4478-a92e-c8a2bd91f48c-operator-scripts\") pod \"cinder-3ba5-account-create-update-vrg4k\" (UID: \"9e568bb4-835b-4478-a92e-c8a2bd91f48c\") " pod="openstack/cinder-3ba5-account-create-update-vrg4k" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.234035 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e568bb4-835b-4478-a92e-c8a2bd91f48c-operator-scripts\") pod \"cinder-3ba5-account-create-update-vrg4k\" (UID: \"9e568bb4-835b-4478-a92e-c8a2bd91f48c\") " pod="openstack/cinder-3ba5-account-create-update-vrg4k" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.276098 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5b5ds\" (UniqueName: \"kubernetes.io/projected/9e568bb4-835b-4478-a92e-c8a2bd91f48c-kube-api-access-5b5ds\") pod \"cinder-3ba5-account-create-update-vrg4k\" (UID: \"9e568bb4-835b-4478-a92e-c8a2bd91f48c\") " pod="openstack/cinder-3ba5-account-create-update-vrg4k" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.282630 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-gq52w"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.284039 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gq52w" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.331795 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-hs9pl"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.333268 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.335257 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-operator-scripts\") pod \"barbican-db-create-h9vhh\" (UID: \"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4\") " pod="openstack/barbican-db-create-h9vhh" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.335347 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qljfl\" (UniqueName: \"kubernetes.io/projected/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-kube-api-access-qljfl\") pod \"barbican-db-create-h9vhh\" (UID: \"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4\") " pod="openstack/barbican-db-create-h9vhh" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.335665 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 14:23:04 crc 
kubenswrapper[4911]: I0310 14:23:04.335981 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.336220 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.336389 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4vwm6" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.344534 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hs9pl"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.356625 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gq52w"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.358016 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3ba5-account-create-update-vrg4k" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.390525 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9400-account-create-update-bl6w7"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.391974 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9400-account-create-update-bl6w7" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.396759 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.415986 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9400-account-create-update-bl6w7"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.439468 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-config-data\") pod \"keystone-db-sync-hs9pl\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.439549 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/316f04a5-be26-4b2b-beb5-54c8851c589f-operator-scripts\") pod \"neutron-db-create-gq52w\" (UID: \"316f04a5-be26-4b2b-beb5-54c8851c589f\") " pod="openstack/neutron-db-create-gq52w" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.439600 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6gqd\" (UniqueName: \"kubernetes.io/projected/316f04a5-be26-4b2b-beb5-54c8851c589f-kube-api-access-n6gqd\") pod \"neutron-db-create-gq52w\" (UID: \"316f04a5-be26-4b2b-beb5-54c8851c589f\") " pod="openstack/neutron-db-create-gq52w" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.439635 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-operator-scripts\") pod \"barbican-db-create-h9vhh\" (UID: \"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4\") " 
pod="openstack/barbican-db-create-h9vhh" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.439672 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qljfl\" (UniqueName: \"kubernetes.io/projected/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-kube-api-access-qljfl\") pod \"barbican-db-create-h9vhh\" (UID: \"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4\") " pod="openstack/barbican-db-create-h9vhh" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.439783 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-combined-ca-bundle\") pod \"keystone-db-sync-hs9pl\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.439832 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5lxz\" (UniqueName: \"kubernetes.io/projected/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-kube-api-access-r5lxz\") pod \"keystone-db-sync-hs9pl\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.441050 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-operator-scripts\") pod \"barbican-db-create-h9vhh\" (UID: \"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4\") " pod="openstack/barbican-db-create-h9vhh" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.491342 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qljfl\" (UniqueName: \"kubernetes.io/projected/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-kube-api-access-qljfl\") pod \"barbican-db-create-h9vhh\" (UID: \"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4\") " 
pod="openstack/barbican-db-create-h9vhh" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.494841 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h9vhh" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.495325 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3438-account-create-update-6zd8p"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.496750 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3438-account-create-update-6zd8p" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.506037 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.537079 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3438-account-create-update-6zd8p"] Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.542238 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-config-data\") pod \"keystone-db-sync-hs9pl\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.542324 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/316f04a5-be26-4b2b-beb5-54c8851c589f-operator-scripts\") pod \"neutron-db-create-gq52w\" (UID: \"316f04a5-be26-4b2b-beb5-54c8851c589f\") " pod="openstack/neutron-db-create-gq52w" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.542385 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6gqd\" (UniqueName: \"kubernetes.io/projected/316f04a5-be26-4b2b-beb5-54c8851c589f-kube-api-access-n6gqd\") pod \"neutron-db-create-gq52w\" (UID: 
\"316f04a5-be26-4b2b-beb5-54c8851c589f\") " pod="openstack/neutron-db-create-gq52w" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.542481 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-combined-ca-bundle\") pod \"keystone-db-sync-hs9pl\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.542506 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5lxz\" (UniqueName: \"kubernetes.io/projected/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-kube-api-access-r5lxz\") pod \"keystone-db-sync-hs9pl\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.542576 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcn9q\" (UniqueName: \"kubernetes.io/projected/f420a99e-9a7a-402f-abef-e299c12a33bc-kube-api-access-rcn9q\") pod \"barbican-9400-account-create-update-bl6w7\" (UID: \"f420a99e-9a7a-402f-abef-e299c12a33bc\") " pod="openstack/barbican-9400-account-create-update-bl6w7" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.542632 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f420a99e-9a7a-402f-abef-e299c12a33bc-operator-scripts\") pod \"barbican-9400-account-create-update-bl6w7\" (UID: \"f420a99e-9a7a-402f-abef-e299c12a33bc\") " pod="openstack/barbican-9400-account-create-update-bl6w7" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.545794 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/316f04a5-be26-4b2b-beb5-54c8851c589f-operator-scripts\") pod 
\"neutron-db-create-gq52w\" (UID: \"316f04a5-be26-4b2b-beb5-54c8851c589f\") " pod="openstack/neutron-db-create-gq52w" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.548312 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-config-data\") pod \"keystone-db-sync-hs9pl\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.550935 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-combined-ca-bundle\") pod \"keystone-db-sync-hs9pl\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.562375 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6gqd\" (UniqueName: \"kubernetes.io/projected/316f04a5-be26-4b2b-beb5-54c8851c589f-kube-api-access-n6gqd\") pod \"neutron-db-create-gq52w\" (UID: \"316f04a5-be26-4b2b-beb5-54c8851c589f\") " pod="openstack/neutron-db-create-gq52w" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.566997 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5lxz\" (UniqueName: \"kubernetes.io/projected/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-kube-api-access-r5lxz\") pod \"keystone-db-sync-hs9pl\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.644617 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b856dbd6-5849-42aa-a88a-942fd434e4cd-operator-scripts\") pod \"neutron-3438-account-create-update-6zd8p\" (UID: \"b856dbd6-5849-42aa-a88a-942fd434e4cd\") " 
pod="openstack/neutron-3438-account-create-update-6zd8p" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.644856 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcn9q\" (UniqueName: \"kubernetes.io/projected/f420a99e-9a7a-402f-abef-e299c12a33bc-kube-api-access-rcn9q\") pod \"barbican-9400-account-create-update-bl6w7\" (UID: \"f420a99e-9a7a-402f-abef-e299c12a33bc\") " pod="openstack/barbican-9400-account-create-update-bl6w7" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.644881 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25x5w\" (UniqueName: \"kubernetes.io/projected/b856dbd6-5849-42aa-a88a-942fd434e4cd-kube-api-access-25x5w\") pod \"neutron-3438-account-create-update-6zd8p\" (UID: \"b856dbd6-5849-42aa-a88a-942fd434e4cd\") " pod="openstack/neutron-3438-account-create-update-6zd8p" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.644903 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f420a99e-9a7a-402f-abef-e299c12a33bc-operator-scripts\") pod \"barbican-9400-account-create-update-bl6w7\" (UID: \"f420a99e-9a7a-402f-abef-e299c12a33bc\") " pod="openstack/barbican-9400-account-create-update-bl6w7" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.646436 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f420a99e-9a7a-402f-abef-e299c12a33bc-operator-scripts\") pod \"barbican-9400-account-create-update-bl6w7\" (UID: \"f420a99e-9a7a-402f-abef-e299c12a33bc\") " pod="openstack/barbican-9400-account-create-update-bl6w7" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.668025 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcn9q\" (UniqueName: 
\"kubernetes.io/projected/f420a99e-9a7a-402f-abef-e299c12a33bc-kube-api-access-rcn9q\") pod \"barbican-9400-account-create-update-bl6w7\" (UID: \"f420a99e-9a7a-402f-abef-e299c12a33bc\") " pod="openstack/barbican-9400-account-create-update-bl6w7" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.678631 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gq52w" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.691628 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.759297 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25x5w\" (UniqueName: \"kubernetes.io/projected/b856dbd6-5849-42aa-a88a-942fd434e4cd-kube-api-access-25x5w\") pod \"neutron-3438-account-create-update-6zd8p\" (UID: \"b856dbd6-5849-42aa-a88a-942fd434e4cd\") " pod="openstack/neutron-3438-account-create-update-6zd8p" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.759437 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b856dbd6-5849-42aa-a88a-942fd434e4cd-operator-scripts\") pod \"neutron-3438-account-create-update-6zd8p\" (UID: \"b856dbd6-5849-42aa-a88a-942fd434e4cd\") " pod="openstack/neutron-3438-account-create-update-6zd8p" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.760720 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b856dbd6-5849-42aa-a88a-942fd434e4cd-operator-scripts\") pod \"neutron-3438-account-create-update-6zd8p\" (UID: \"b856dbd6-5849-42aa-a88a-942fd434e4cd\") " pod="openstack/neutron-3438-account-create-update-6zd8p" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.778804 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-25x5w\" (UniqueName: \"kubernetes.io/projected/b856dbd6-5849-42aa-a88a-942fd434e4cd-kube-api-access-25x5w\") pod \"neutron-3438-account-create-update-6zd8p\" (UID: \"b856dbd6-5849-42aa-a88a-942fd434e4cd\") " pod="openstack/neutron-3438-account-create-update-6zd8p" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.801944 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9400-account-create-update-bl6w7" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.838096 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3438-account-create-update-6zd8p" Mar 10 14:23:04 crc kubenswrapper[4911]: I0310 14:23:04.921888 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vsqwc"] Mar 10 14:23:05 crc kubenswrapper[4911]: W0310 14:23:05.033512 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e568bb4_835b_4478_a92e_c8a2bd91f48c.slice/crio-a942ffce74ad08de55b4080ee82955c991e1bd9e57a2ad351400add5c388d17e WatchSource:0}: Error finding container a942ffce74ad08de55b4080ee82955c991e1bd9e57a2ad351400add5c388d17e: Status 404 returned error can't find the container with id a942ffce74ad08de55b4080ee82955c991e1bd9e57a2ad351400add5c388d17e Mar 10 14:23:05 crc kubenswrapper[4911]: I0310 14:23:05.037259 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3ba5-account-create-update-vrg4k"] Mar 10 14:23:05 crc kubenswrapper[4911]: I0310 14:23:05.038685 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vsqwc" event={"ID":"be4f56e0-2bd6-482b-a147-9fccecd2aecc","Type":"ContainerStarted","Data":"b980a63adb49ed08374db1410f03a1e627216c5245c545446beec03e917b82a1"} Mar 10 14:23:05 crc kubenswrapper[4911]: I0310 14:23:05.117351 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-db-create-h9vhh"] Mar 10 14:23:05 crc kubenswrapper[4911]: I0310 14:23:05.344115 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gq52w"] Mar 10 14:23:05 crc kubenswrapper[4911]: W0310 14:23:05.377350 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa710f1a_fd18_49b5_bdd1_3afbe21047ec.slice/crio-d6d73172730227341a2b37efa272e3361ac8386d9cb5d327ddea364c1a2ddff6 WatchSource:0}: Error finding container d6d73172730227341a2b37efa272e3361ac8386d9cb5d327ddea364c1a2ddff6: Status 404 returned error can't find the container with id d6d73172730227341a2b37efa272e3361ac8386d9cb5d327ddea364c1a2ddff6 Mar 10 14:23:05 crc kubenswrapper[4911]: I0310 14:23:05.379186 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hs9pl"] Mar 10 14:23:05 crc kubenswrapper[4911]: I0310 14:23:05.471000 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3438-account-create-update-6zd8p"] Mar 10 14:23:05 crc kubenswrapper[4911]: I0310 14:23:05.484323 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9400-account-create-update-bl6w7"] Mar 10 14:23:05 crc kubenswrapper[4911]: W0310 14:23:05.509110 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf420a99e_9a7a_402f_abef_e299c12a33bc.slice/crio-bd1b83209b4b5d55807d4fcb996260d22fa7edb67c8f452d679247bb277d2be5 WatchSource:0}: Error finding container bd1b83209b4b5d55807d4fcb996260d22fa7edb67c8f452d679247bb277d2be5: Status 404 returned error can't find the container with id bd1b83209b4b5d55807d4fcb996260d22fa7edb67c8f452d679247bb277d2be5 Mar 10 14:23:05 crc kubenswrapper[4911]: W0310 14:23:05.510071 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb856dbd6_5849_42aa_a88a_942fd434e4cd.slice/crio-103da99e354784f21442eabe718dcda1298523b793d12b6de1e84cce1bef762a WatchSource:0}: Error finding container 103da99e354784f21442eabe718dcda1298523b793d12b6de1e84cce1bef762a: Status 404 returned error can't find the container with id 103da99e354784f21442eabe718dcda1298523b793d12b6de1e84cce1bef762a Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.051317 4911 generic.go:334] "Generic (PLEG): container finished" podID="316f04a5-be26-4b2b-beb5-54c8851c589f" containerID="88ffca46975943a274764ed341dbce2416b33f42d0a79fa7a271350471da528f" exitCode=0 Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.051762 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gq52w" event={"ID":"316f04a5-be26-4b2b-beb5-54c8851c589f","Type":"ContainerDied","Data":"88ffca46975943a274764ed341dbce2416b33f42d0a79fa7a271350471da528f"} Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.051811 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gq52w" event={"ID":"316f04a5-be26-4b2b-beb5-54c8851c589f","Type":"ContainerStarted","Data":"62c328ef49618e31f3196b4a210bf4842e0dd0184768c952f34301dfb993b55c"} Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.054499 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hs9pl" event={"ID":"fa710f1a-fd18-49b5-bdd1-3afbe21047ec","Type":"ContainerStarted","Data":"d6d73172730227341a2b37efa272e3361ac8386d9cb5d327ddea364c1a2ddff6"} Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.056896 4911 generic.go:334] "Generic (PLEG): container finished" podID="9e568bb4-835b-4478-a92e-c8a2bd91f48c" containerID="38c1707684e3284b1a4f9941449161009abe645bb461871361585173ad792c23" exitCode=0 Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.056940 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-3ba5-account-create-update-vrg4k" event={"ID":"9e568bb4-835b-4478-a92e-c8a2bd91f48c","Type":"ContainerDied","Data":"38c1707684e3284b1a4f9941449161009abe645bb461871361585173ad792c23"} Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.057016 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3ba5-account-create-update-vrg4k" event={"ID":"9e568bb4-835b-4478-a92e-c8a2bd91f48c","Type":"ContainerStarted","Data":"a942ffce74ad08de55b4080ee82955c991e1bd9e57a2ad351400add5c388d17e"} Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.062252 4911 generic.go:334] "Generic (PLEG): container finished" podID="4e171a5e-2cdb-458b-9fe3-aeb5cde435c4" containerID="8298867308da7df6b329a7316ecb7f689329f3d3eed6c906c0e9b44c2d7125ef" exitCode=0 Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.062355 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h9vhh" event={"ID":"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4","Type":"ContainerDied","Data":"8298867308da7df6b329a7316ecb7f689329f3d3eed6c906c0e9b44c2d7125ef"} Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.062390 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h9vhh" event={"ID":"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4","Type":"ContainerStarted","Data":"c19f521d1312e4cc66e861ee9bb45e994f13cde8d28a5a095debc064fface65a"} Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.065733 4911 generic.go:334] "Generic (PLEG): container finished" podID="b856dbd6-5849-42aa-a88a-942fd434e4cd" containerID="bbefc3fbf5583dfe5be76c340bf542f4715b5abaac8cd374f710ec6137a24e23" exitCode=0 Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.065811 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3438-account-create-update-6zd8p" event={"ID":"b856dbd6-5849-42aa-a88a-942fd434e4cd","Type":"ContainerDied","Data":"bbefc3fbf5583dfe5be76c340bf542f4715b5abaac8cd374f710ec6137a24e23"} Mar 10 
14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.065840 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3438-account-create-update-6zd8p" event={"ID":"b856dbd6-5849-42aa-a88a-942fd434e4cd","Type":"ContainerStarted","Data":"103da99e354784f21442eabe718dcda1298523b793d12b6de1e84cce1bef762a"} Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.074661 4911 generic.go:334] "Generic (PLEG): container finished" podID="be4f56e0-2bd6-482b-a147-9fccecd2aecc" containerID="2bd551f48cab288b0d0fb2e8adb031410c098f214687834958bc3337ddfab601" exitCode=0 Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.074749 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vsqwc" event={"ID":"be4f56e0-2bd6-482b-a147-9fccecd2aecc","Type":"ContainerDied","Data":"2bd551f48cab288b0d0fb2e8adb031410c098f214687834958bc3337ddfab601"} Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.079425 4911 generic.go:334] "Generic (PLEG): container finished" podID="f420a99e-9a7a-402f-abef-e299c12a33bc" containerID="80b7f93e162fca480a7790b54ad1f4f967f99683aaa9320f6c7cd2223ef3704a" exitCode=0 Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.079641 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9400-account-create-update-bl6w7" event={"ID":"f420a99e-9a7a-402f-abef-e299c12a33bc","Type":"ContainerDied","Data":"80b7f93e162fca480a7790b54ad1f4f967f99683aaa9320f6c7cd2223ef3704a"} Mar 10 14:23:06 crc kubenswrapper[4911]: I0310 14:23:06.079740 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9400-account-create-update-bl6w7" event={"ID":"f420a99e-9a7a-402f-abef-e299c12a33bc","Type":"ContainerStarted","Data":"bd1b83209b4b5d55807d4fcb996260d22fa7edb67c8f452d679247bb277d2be5"} Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.117382 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vsqwc" 
event={"ID":"be4f56e0-2bd6-482b-a147-9fccecd2aecc","Type":"ContainerDied","Data":"b980a63adb49ed08374db1410f03a1e627216c5245c545446beec03e917b82a1"} Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.118820 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b980a63adb49ed08374db1410f03a1e627216c5245c545446beec03e917b82a1" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.119875 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9400-account-create-update-bl6w7" event={"ID":"f420a99e-9a7a-402f-abef-e299c12a33bc","Type":"ContainerDied","Data":"bd1b83209b4b5d55807d4fcb996260d22fa7edb67c8f452d679247bb277d2be5"} Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.119901 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd1b83209b4b5d55807d4fcb996260d22fa7edb67c8f452d679247bb277d2be5" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.122015 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gq52w" event={"ID":"316f04a5-be26-4b2b-beb5-54c8851c589f","Type":"ContainerDied","Data":"62c328ef49618e31f3196b4a210bf4842e0dd0184768c952f34301dfb993b55c"} Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.122037 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62c328ef49618e31f3196b4a210bf4842e0dd0184768c952f34301dfb993b55c" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.124258 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3ba5-account-create-update-vrg4k" event={"ID":"9e568bb4-835b-4478-a92e-c8a2bd91f48c","Type":"ContainerDied","Data":"a942ffce74ad08de55b4080ee82955c991e1bd9e57a2ad351400add5c388d17e"} Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.124275 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a942ffce74ad08de55b4080ee82955c991e1bd9e57a2ad351400add5c388d17e" Mar 10 14:23:10 crc 
kubenswrapper[4911]: I0310 14:23:10.126349 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h9vhh" event={"ID":"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4","Type":"ContainerDied","Data":"c19f521d1312e4cc66e861ee9bb45e994f13cde8d28a5a095debc064fface65a"} Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.126440 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c19f521d1312e4cc66e861ee9bb45e994f13cde8d28a5a095debc064fface65a" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.128564 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3438-account-create-update-6zd8p" event={"ID":"b856dbd6-5849-42aa-a88a-942fd434e4cd","Type":"ContainerDied","Data":"103da99e354784f21442eabe718dcda1298523b793d12b6de1e84cce1bef762a"} Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.128601 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="103da99e354784f21442eabe718dcda1298523b793d12b6de1e84cce1bef762a" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.133193 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h9vhh" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.156760 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gq52w" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.187966 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vsqwc" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.198505 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3ba5-account-create-update-vrg4k" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.220802 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9400-account-create-update-bl6w7" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.263088 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3438-account-create-update-6zd8p" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.302761 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jjc5\" (UniqueName: \"kubernetes.io/projected/be4f56e0-2bd6-482b-a147-9fccecd2aecc-kube-api-access-8jjc5\") pod \"be4f56e0-2bd6-482b-a147-9fccecd2aecc\" (UID: \"be4f56e0-2bd6-482b-a147-9fccecd2aecc\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.302813 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6gqd\" (UniqueName: \"kubernetes.io/projected/316f04a5-be26-4b2b-beb5-54c8851c589f-kube-api-access-n6gqd\") pod \"316f04a5-be26-4b2b-beb5-54c8851c589f\" (UID: \"316f04a5-be26-4b2b-beb5-54c8851c589f\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.302887 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/316f04a5-be26-4b2b-beb5-54c8851c589f-operator-scripts\") pod \"316f04a5-be26-4b2b-beb5-54c8851c589f\" (UID: \"316f04a5-be26-4b2b-beb5-54c8851c589f\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.302927 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-operator-scripts\") pod \"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4\" (UID: \"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.302976 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e568bb4-835b-4478-a92e-c8a2bd91f48c-operator-scripts\") 
pod \"9e568bb4-835b-4478-a92e-c8a2bd91f48c\" (UID: \"9e568bb4-835b-4478-a92e-c8a2bd91f48c\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.303018 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b5ds\" (UniqueName: \"kubernetes.io/projected/9e568bb4-835b-4478-a92e-c8a2bd91f48c-kube-api-access-5b5ds\") pod \"9e568bb4-835b-4478-a92e-c8a2bd91f48c\" (UID: \"9e568bb4-835b-4478-a92e-c8a2bd91f48c\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.303033 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qljfl\" (UniqueName: \"kubernetes.io/projected/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-kube-api-access-qljfl\") pod \"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4\" (UID: \"4e171a5e-2cdb-458b-9fe3-aeb5cde435c4\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.303064 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4f56e0-2bd6-482b-a147-9fccecd2aecc-operator-scripts\") pod \"be4f56e0-2bd6-482b-a147-9fccecd2aecc\" (UID: \"be4f56e0-2bd6-482b-a147-9fccecd2aecc\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.304195 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/316f04a5-be26-4b2b-beb5-54c8851c589f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "316f04a5-be26-4b2b-beb5-54c8851c589f" (UID: "316f04a5-be26-4b2b-beb5-54c8851c589f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.304203 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e568bb4-835b-4478-a92e-c8a2bd91f48c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e568bb4-835b-4478-a92e-c8a2bd91f48c" (UID: "9e568bb4-835b-4478-a92e-c8a2bd91f48c"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.304519 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be4f56e0-2bd6-482b-a147-9fccecd2aecc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be4f56e0-2bd6-482b-a147-9fccecd2aecc" (UID: "be4f56e0-2bd6-482b-a147-9fccecd2aecc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.304765 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e171a5e-2cdb-458b-9fe3-aeb5cde435c4" (UID: "4e171a5e-2cdb-458b-9fe3-aeb5cde435c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.307261 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4f56e0-2bd6-482b-a147-9fccecd2aecc-kube-api-access-8jjc5" (OuterVolumeSpecName: "kube-api-access-8jjc5") pod "be4f56e0-2bd6-482b-a147-9fccecd2aecc" (UID: "be4f56e0-2bd6-482b-a147-9fccecd2aecc"). InnerVolumeSpecName "kube-api-access-8jjc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.308421 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316f04a5-be26-4b2b-beb5-54c8851c589f-kube-api-access-n6gqd" (OuterVolumeSpecName: "kube-api-access-n6gqd") pod "316f04a5-be26-4b2b-beb5-54c8851c589f" (UID: "316f04a5-be26-4b2b-beb5-54c8851c589f"). InnerVolumeSpecName "kube-api-access-n6gqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.308944 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-kube-api-access-qljfl" (OuterVolumeSpecName: "kube-api-access-qljfl") pod "4e171a5e-2cdb-458b-9fe3-aeb5cde435c4" (UID: "4e171a5e-2cdb-458b-9fe3-aeb5cde435c4"). InnerVolumeSpecName "kube-api-access-qljfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.309460 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e568bb4-835b-4478-a92e-c8a2bd91f48c-kube-api-access-5b5ds" (OuterVolumeSpecName: "kube-api-access-5b5ds") pod "9e568bb4-835b-4478-a92e-c8a2bd91f48c" (UID: "9e568bb4-835b-4478-a92e-c8a2bd91f48c"). InnerVolumeSpecName "kube-api-access-5b5ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.404690 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25x5w\" (UniqueName: \"kubernetes.io/projected/b856dbd6-5849-42aa-a88a-942fd434e4cd-kube-api-access-25x5w\") pod \"b856dbd6-5849-42aa-a88a-942fd434e4cd\" (UID: \"b856dbd6-5849-42aa-a88a-942fd434e4cd\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.404840 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcn9q\" (UniqueName: \"kubernetes.io/projected/f420a99e-9a7a-402f-abef-e299c12a33bc-kube-api-access-rcn9q\") pod \"f420a99e-9a7a-402f-abef-e299c12a33bc\" (UID: \"f420a99e-9a7a-402f-abef-e299c12a33bc\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.404942 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f420a99e-9a7a-402f-abef-e299c12a33bc-operator-scripts\") pod 
\"f420a99e-9a7a-402f-abef-e299c12a33bc\" (UID: \"f420a99e-9a7a-402f-abef-e299c12a33bc\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.404969 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b856dbd6-5849-42aa-a88a-942fd434e4cd-operator-scripts\") pod \"b856dbd6-5849-42aa-a88a-942fd434e4cd\" (UID: \"b856dbd6-5849-42aa-a88a-942fd434e4cd\") " Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.405466 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4f56e0-2bd6-482b-a147-9fccecd2aecc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.405491 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jjc5\" (UniqueName: \"kubernetes.io/projected/be4f56e0-2bd6-482b-a147-9fccecd2aecc-kube-api-access-8jjc5\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.405506 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6gqd\" (UniqueName: \"kubernetes.io/projected/316f04a5-be26-4b2b-beb5-54c8851c589f-kube-api-access-n6gqd\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.405519 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/316f04a5-be26-4b2b-beb5-54c8851c589f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.405531 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.405542 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9e568bb4-835b-4478-a92e-c8a2bd91f48c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.405554 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qljfl\" (UniqueName: \"kubernetes.io/projected/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4-kube-api-access-qljfl\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.405565 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b5ds\" (UniqueName: \"kubernetes.io/projected/9e568bb4-835b-4478-a92e-c8a2bd91f48c-kube-api-access-5b5ds\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.405700 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b856dbd6-5849-42aa-a88a-942fd434e4cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b856dbd6-5849-42aa-a88a-942fd434e4cd" (UID: "b856dbd6-5849-42aa-a88a-942fd434e4cd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.405756 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f420a99e-9a7a-402f-abef-e299c12a33bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f420a99e-9a7a-402f-abef-e299c12a33bc" (UID: "f420a99e-9a7a-402f-abef-e299c12a33bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.409102 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f420a99e-9a7a-402f-abef-e299c12a33bc-kube-api-access-rcn9q" (OuterVolumeSpecName: "kube-api-access-rcn9q") pod "f420a99e-9a7a-402f-abef-e299c12a33bc" (UID: "f420a99e-9a7a-402f-abef-e299c12a33bc"). InnerVolumeSpecName "kube-api-access-rcn9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.409242 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b856dbd6-5849-42aa-a88a-942fd434e4cd-kube-api-access-25x5w" (OuterVolumeSpecName: "kube-api-access-25x5w") pod "b856dbd6-5849-42aa-a88a-942fd434e4cd" (UID: "b856dbd6-5849-42aa-a88a-942fd434e4cd"). InnerVolumeSpecName "kube-api-access-25x5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.506900 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcn9q\" (UniqueName: \"kubernetes.io/projected/f420a99e-9a7a-402f-abef-e299c12a33bc-kube-api-access-rcn9q\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.506944 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f420a99e-9a7a-402f-abef-e299c12a33bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.506960 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b856dbd6-5849-42aa-a88a-942fd434e4cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:10 crc kubenswrapper[4911]: I0310 14:23:10.506972 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25x5w\" (UniqueName: \"kubernetes.io/projected/b856dbd6-5849-42aa-a88a-942fd434e4cd-kube-api-access-25x5w\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:11 crc kubenswrapper[4911]: I0310 14:23:11.139780 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gq52w" Mar 10 14:23:11 crc kubenswrapper[4911]: I0310 14:23:11.149614 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hs9pl" event={"ID":"fa710f1a-fd18-49b5-bdd1-3afbe21047ec","Type":"ContainerStarted","Data":"712a84bd650d353d290828d088e888703a58b95b23832a4b73e1ff00ca78cc71"} Mar 10 14:23:11 crc kubenswrapper[4911]: I0310 14:23:11.149648 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9400-account-create-update-bl6w7" Mar 10 14:23:11 crc kubenswrapper[4911]: I0310 14:23:11.149679 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3438-account-create-update-6zd8p" Mar 10 14:23:11 crc kubenswrapper[4911]: I0310 14:23:11.149733 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3ba5-account-create-update-vrg4k" Mar 10 14:23:11 crc kubenswrapper[4911]: I0310 14:23:11.149988 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h9vhh" Mar 10 14:23:11 crc kubenswrapper[4911]: I0310 14:23:11.149594 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vsqwc" Mar 10 14:23:11 crc kubenswrapper[4911]: I0310 14:23:11.197521 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-hs9pl" podStartSLOduration=2.590941157 podStartE2EDuration="7.197484044s" podCreationTimestamp="2026-03-10 14:23:04 +0000 UTC" firstStartedPulling="2026-03-10 14:23:05.384974952 +0000 UTC m=+1289.948494869" lastFinishedPulling="2026-03-10 14:23:09.991517839 +0000 UTC m=+1294.555037756" observedRunningTime="2026-03-10 14:23:11.177954444 +0000 UTC m=+1295.741474401" watchObservedRunningTime="2026-03-10 14:23:11.197484044 +0000 UTC m=+1295.761003971" Mar 10 14:23:14 crc kubenswrapper[4911]: I0310 14:23:14.170785 4911 generic.go:334] "Generic (PLEG): container finished" podID="fa710f1a-fd18-49b5-bdd1-3afbe21047ec" containerID="712a84bd650d353d290828d088e888703a58b95b23832a4b73e1ff00ca78cc71" exitCode=0 Mar 10 14:23:14 crc kubenswrapper[4911]: I0310 14:23:14.170874 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hs9pl" event={"ID":"fa710f1a-fd18-49b5-bdd1-3afbe21047ec","Type":"ContainerDied","Data":"712a84bd650d353d290828d088e888703a58b95b23832a4b73e1ff00ca78cc71"} Mar 10 14:23:15 crc kubenswrapper[4911]: I0310 14:23:15.535900 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:15 crc kubenswrapper[4911]: I0310 14:23:15.735151 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5lxz\" (UniqueName: \"kubernetes.io/projected/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-kube-api-access-r5lxz\") pod \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " Mar 10 14:23:15 crc kubenswrapper[4911]: I0310 14:23:15.735361 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-combined-ca-bundle\") pod \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " Mar 10 14:23:15 crc kubenswrapper[4911]: I0310 14:23:15.735428 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-config-data\") pod \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\" (UID: \"fa710f1a-fd18-49b5-bdd1-3afbe21047ec\") " Mar 10 14:23:15 crc kubenswrapper[4911]: I0310 14:23:15.741733 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-kube-api-access-r5lxz" (OuterVolumeSpecName: "kube-api-access-r5lxz") pod "fa710f1a-fd18-49b5-bdd1-3afbe21047ec" (UID: "fa710f1a-fd18-49b5-bdd1-3afbe21047ec"). InnerVolumeSpecName "kube-api-access-r5lxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:15 crc kubenswrapper[4911]: I0310 14:23:15.760901 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa710f1a-fd18-49b5-bdd1-3afbe21047ec" (UID: "fa710f1a-fd18-49b5-bdd1-3afbe21047ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:15 crc kubenswrapper[4911]: I0310 14:23:15.784281 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-config-data" (OuterVolumeSpecName: "config-data") pod "fa710f1a-fd18-49b5-bdd1-3afbe21047ec" (UID: "fa710f1a-fd18-49b5-bdd1-3afbe21047ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:15 crc kubenswrapper[4911]: I0310 14:23:15.838178 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:15 crc kubenswrapper[4911]: I0310 14:23:15.838269 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:15 crc kubenswrapper[4911]: I0310 14:23:15.838280 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5lxz\" (UniqueName: \"kubernetes.io/projected/fa710f1a-fd18-49b5-bdd1-3afbe21047ec-kube-api-access-r5lxz\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.240549 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hs9pl" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.252543 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hs9pl" event={"ID":"fa710f1a-fd18-49b5-bdd1-3afbe21047ec","Type":"ContainerDied","Data":"d6d73172730227341a2b37efa272e3361ac8386d9cb5d327ddea364c1a2ddff6"} Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.252603 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6d73172730227341a2b37efa272e3361ac8386d9cb5d327ddea364c1a2ddff6" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.428215 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6w9g2"] Mar 10 14:23:16 crc kubenswrapper[4911]: E0310 14:23:16.428736 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e171a5e-2cdb-458b-9fe3-aeb5cde435c4" containerName="mariadb-database-create" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.428764 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e171a5e-2cdb-458b-9fe3-aeb5cde435c4" containerName="mariadb-database-create" Mar 10 14:23:16 crc kubenswrapper[4911]: E0310 14:23:16.428783 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e568bb4-835b-4478-a92e-c8a2bd91f48c" containerName="mariadb-account-create-update" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.428791 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e568bb4-835b-4478-a92e-c8a2bd91f48c" containerName="mariadb-account-create-update" Mar 10 14:23:16 crc kubenswrapper[4911]: E0310 14:23:16.428815 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4f56e0-2bd6-482b-a147-9fccecd2aecc" containerName="mariadb-database-create" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.428823 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4f56e0-2bd6-482b-a147-9fccecd2aecc" containerName="mariadb-database-create" Mar 
10 14:23:16 crc kubenswrapper[4911]: E0310 14:23:16.428833 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f420a99e-9a7a-402f-abef-e299c12a33bc" containerName="mariadb-account-create-update" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.428841 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f420a99e-9a7a-402f-abef-e299c12a33bc" containerName="mariadb-account-create-update" Mar 10 14:23:16 crc kubenswrapper[4911]: E0310 14:23:16.428857 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa710f1a-fd18-49b5-bdd1-3afbe21047ec" containerName="keystone-db-sync" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.428863 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa710f1a-fd18-49b5-bdd1-3afbe21047ec" containerName="keystone-db-sync" Mar 10 14:23:16 crc kubenswrapper[4911]: E0310 14:23:16.428874 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b856dbd6-5849-42aa-a88a-942fd434e4cd" containerName="mariadb-account-create-update" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.428881 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b856dbd6-5849-42aa-a88a-942fd434e4cd" containerName="mariadb-account-create-update" Mar 10 14:23:16 crc kubenswrapper[4911]: E0310 14:23:16.428894 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316f04a5-be26-4b2b-beb5-54c8851c589f" containerName="mariadb-database-create" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.428900 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="316f04a5-be26-4b2b-beb5-54c8851c589f" containerName="mariadb-database-create" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.429096 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="316f04a5-be26-4b2b-beb5-54c8851c589f" containerName="mariadb-database-create" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.429110 4911 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b856dbd6-5849-42aa-a88a-942fd434e4cd" containerName="mariadb-account-create-update" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.429121 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e171a5e-2cdb-458b-9fe3-aeb5cde435c4" containerName="mariadb-database-create" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.429135 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f420a99e-9a7a-402f-abef-e299c12a33bc" containerName="mariadb-account-create-update" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.429151 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4f56e0-2bd6-482b-a147-9fccecd2aecc" containerName="mariadb-database-create" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.429165 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e568bb4-835b-4478-a92e-c8a2bd91f48c" containerName="mariadb-account-create-update" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.429177 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa710f1a-fd18-49b5-bdd1-3afbe21047ec" containerName="keystone-db-sync" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.429963 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.432833 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.437105 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.437427 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.437473 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4vwm6" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.437640 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.460639 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6w9g2"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.483785 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lzxzm"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.485381 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.488830 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lzxzm"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.557943 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-config-data\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.558003 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-scripts\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.560870 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8xkb\" (UniqueName: \"kubernetes.io/projected/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-kube-api-access-n8xkb\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.560932 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-fernet-keys\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.560961 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-credential-keys\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.561056 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-combined-ca-bundle\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.614201 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b4fbb8c45-zw22m"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.616435 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.619100 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.623500 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.623706 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-65f4q" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.623971 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.643140 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b4fbb8c45-zw22m"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.661984 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-scripts\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662067 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc8w6\" (UniqueName: \"kubernetes.io/projected/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-kube-api-access-gc8w6\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662126 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsgzm\" (UniqueName: \"kubernetes.io/projected/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-kube-api-access-dsgzm\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662152 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-horizon-secret-key\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662197 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-scripts\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662217 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-svc\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662232 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662308 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-config-data\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662366 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8xkb\" (UniqueName: \"kubernetes.io/projected/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-kube-api-access-n8xkb\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662399 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-fernet-keys\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662476 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-credential-keys\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662559 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-combined-ca-bundle\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662653 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662690 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662763 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-config-data\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662820 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-logs\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.662847 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-config\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.673986 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-combined-ca-bundle\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.677132 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-scripts\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.681803 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-fernet-keys\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.683291 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-credential-keys\") pod \"keystone-bootstrap-6w9g2\" (UID: 
\"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.699547 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-config-data\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.710563 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8xkb\" (UniqueName: \"kubernetes.io/projected/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-kube-api-access-n8xkb\") pod \"keystone-bootstrap-6w9g2\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.744771 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-k62d9"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.746064 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.746919 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.755273 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-cg8dm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.755498 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.756013 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.758184 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.763886 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-scripts\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.763933 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-svc\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.763951 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.763983 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pfvx\" (UniqueName: \"kubernetes.io/projected/add8035b-c6a4-47d1-aa42-ed381ba87b11-kube-api-access-4pfvx\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764001 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: 
I0310 14:23:16.764018 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-combined-ca-bundle\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764040 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6tjk\" (UniqueName: \"kubernetes.io/projected/9942f116-fd81-4e92-bd0f-add9b12b4c08-kube-api-access-k6tjk\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764061 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-scripts\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764078 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-config-data\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764102 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-config-data\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764128 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-scripts\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764150 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764167 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764194 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-logs\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764216 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-config\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764232 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-config-data\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764265 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764281 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-run-httpd\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764304 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc8w6\" (UniqueName: \"kubernetes.io/projected/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-kube-api-access-gc8w6\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764324 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-log-httpd\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764348 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9942f116-fd81-4e92-bd0f-add9b12b4c08-etc-machine-id\") pod \"cinder-db-sync-k62d9\" (UID: 
\"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764366 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsgzm\" (UniqueName: \"kubernetes.io/projected/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-kube-api-access-dsgzm\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764389 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-db-sync-config-data\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.764406 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-horizon-secret-key\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.769990 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.770245 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.770439 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.771657 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.772138 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-scripts\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.772641 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-svc\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.777886 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.780994 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-config-data\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.782543 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: 
\"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.786470 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-config\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.788899 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-k62d9"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.791148 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-horizon-secret-key\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.793283 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-logs\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.810162 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.855999 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsgzm\" (UniqueName: \"kubernetes.io/projected/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-kube-api-access-dsgzm\") pod \"dnsmasq-dns-847c4cc679-lzxzm\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.866389 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-log-httpd\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.866458 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9942f116-fd81-4e92-bd0f-add9b12b4c08-etc-machine-id\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.866486 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-db-sync-config-data\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.866528 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pfvx\" (UniqueName: \"kubernetes.io/projected/add8035b-c6a4-47d1-aa42-ed381ba87b11-kube-api-access-4pfvx\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.866547 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.866591 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-combined-ca-bundle\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.866612 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6tjk\" (UniqueName: \"kubernetes.io/projected/9942f116-fd81-4e92-bd0f-add9b12b4c08-kube-api-access-k6tjk\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.866651 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-scripts\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.866685 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-config-data\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.868764 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-log-httpd\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.873639 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-scripts\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 
crc kubenswrapper[4911]: I0310 14:23:16.873823 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-config-data\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.873948 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.873983 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-run-httpd\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.874499 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bd5cf7c7f-sk89q"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.874545 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-run-httpd\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.874591 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9942f116-fd81-4e92-bd0f-add9b12b4c08-etc-machine-id\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.885821 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-db-sync-config-data\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.886402 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-scripts\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.892074 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-scripts\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.892567 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.893225 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.896928 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-combined-ca-bundle\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " 
pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.904236 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-config-data\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.904795 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc8w6\" (UniqueName: \"kubernetes.io/projected/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-kube-api-access-gc8w6\") pod \"horizon-6b4fbb8c45-zw22m\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.905621 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-config-data\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.927411 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6tjk\" (UniqueName: \"kubernetes.io/projected/9942f116-fd81-4e92-bd0f-add9b12b4c08-kube-api-access-k6tjk\") pod \"cinder-db-sync-k62d9\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.947003 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pfvx\" (UniqueName: \"kubernetes.io/projected/add8035b-c6a4-47d1-aa42-ed381ba87b11-kube-api-access-4pfvx\") pod \"ceilometer-0\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " pod="openstack/ceilometer-0" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.948435 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.951453 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.966788 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bd5cf7c7f-sk89q"] Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.977701 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-config-data\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.977773 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-logs\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.977848 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6sks\" (UniqueName: \"kubernetes.io/projected/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-kube-api-access-g6sks\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:16 crc kubenswrapper[4911]: I0310 14:23:16.977874 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-scripts\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:16 crc 
kubenswrapper[4911]: I0310 14:23:16.977992 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-horizon-secret-key\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.012472 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-85swm"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.014025 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.029689 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-t4gl8" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.030109 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.046088 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wk4q4"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.062914 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.076641 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s462l" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.077044 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.077187 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079330 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-config-data\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079382 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-logs\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079421 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-db-sync-config-data\") pod \"barbican-db-sync-85swm\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079470 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-combined-ca-bundle\") pod 
\"barbican-db-sync-85swm\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079503 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6sks\" (UniqueName: \"kubernetes.io/projected/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-kube-api-access-g6sks\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079527 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-scripts\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079558 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bhpq\" (UniqueName: \"kubernetes.io/projected/67b878c7-d1cf-4656-8762-7be57cf1491a-kube-api-access-2bhpq\") pod \"barbican-db-sync-85swm\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079627 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-horizon-secret-key\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079687 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzljb\" (UniqueName: \"kubernetes.io/projected/f0db6312-44e4-4765-aabf-1d8620893756-kube-api-access-rzljb\") pod \"neutron-db-sync-wk4q4\" (UID: 
\"f0db6312-44e4-4765-aabf-1d8620893756\") " pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079754 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-combined-ca-bundle\") pod \"neutron-db-sync-wk4q4\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.079781 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-config\") pod \"neutron-db-sync-wk4q4\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.094290 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-scripts\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.095664 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-config-data\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.104326 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-logs\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.113159 4911 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.117705 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-horizon-secret-key\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.120777 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-85swm"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.151134 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6sks\" (UniqueName: \"kubernetes.io/projected/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-kube-api-access-g6sks\") pod \"horizon-bd5cf7c7f-sk89q\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.151233 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wk4q4"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.190182 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-combined-ca-bundle\") pod \"neutron-db-sync-wk4q4\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.191263 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-config\") pod \"neutron-db-sync-wk4q4\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.191481 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-db-sync-config-data\") pod \"barbican-db-sync-85swm\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.191625 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-combined-ca-bundle\") pod \"barbican-db-sync-85swm\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.191779 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bhpq\" (UniqueName: \"kubernetes.io/projected/67b878c7-d1cf-4656-8762-7be57cf1491a-kube-api-access-2bhpq\") pod \"barbican-db-sync-85swm\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.194864 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzljb\" (UniqueName: \"kubernetes.io/projected/f0db6312-44e4-4765-aabf-1d8620893756-kube-api-access-rzljb\") pod \"neutron-db-sync-wk4q4\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.206439 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-combined-ca-bundle\") pod \"barbican-db-sync-85swm\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.219887 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2bhpq\" (UniqueName: \"kubernetes.io/projected/67b878c7-d1cf-4656-8762-7be57cf1491a-kube-api-access-2bhpq\") pod \"barbican-db-sync-85swm\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.227102 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzljb\" (UniqueName: \"kubernetes.io/projected/f0db6312-44e4-4765-aabf-1d8620893756-kube-api-access-rzljb\") pod \"neutron-db-sync-wk4q4\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.229563 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-combined-ca-bundle\") pod \"neutron-db-sync-wk4q4\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.229874 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-k62d9" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.233911 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-config\") pod \"neutron-db-sync-wk4q4\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.246474 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.251096 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-db-sync-config-data\") pod \"barbican-db-sync-85swm\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.271485 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.286587 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lzxzm"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.325978 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-8jdnn"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.327712 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.343105 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-8jdnn"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.351248 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2s85z"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.356374 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-85swm" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.356475 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.360450 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.360778 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.361311 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6z6nd" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.386547 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.389959 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.393156 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.393437 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-94ddw" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.394572 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.394873 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.400802 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2s85z"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402069 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402114 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402171 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzbxd\" (UniqueName: \"kubernetes.io/projected/44517f61-1ece-44c0-8831-4a3e5d188c0f-kube-api-access-wzbxd\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402202 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-scripts\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402266 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbjgr\" (UniqueName: \"kubernetes.io/projected/86c9857f-704f-48ce-b90b-9275e9eba41a-kube-api-access-pbjgr\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402284 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402303 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-config\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402330 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-config-data\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402355 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402372 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c9857f-704f-48ce-b90b-9275e9eba41a-logs\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402412 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402454 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-scripts\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402501 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-combined-ca-bundle\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402518 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-logs\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402551 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402572 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402602 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-config-data\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402631 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bjwv\" (UniqueName: \"kubernetes.io/projected/076cf316-66db-40bb-b3bd-bb4d35ebe53b-kube-api-access-8bjwv\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.402674 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.420309 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.420870 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504554 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbjgr\" (UniqueName: \"kubernetes.io/projected/86c9857f-704f-48ce-b90b-9275e9eba41a-kube-api-access-pbjgr\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504607 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504629 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-config\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504655 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-config-data\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504689 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c9857f-704f-48ce-b90b-9275e9eba41a-logs\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc 
kubenswrapper[4911]: I0310 14:23:17.504712 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504766 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504794 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-scripts\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504845 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-combined-ca-bundle\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504860 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-logs\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504899 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504918 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504938 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-config-data\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504975 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bjwv\" (UniqueName: \"kubernetes.io/projected/076cf316-66db-40bb-b3bd-bb4d35ebe53b-kube-api-access-8bjwv\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.504998 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.505035 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.505070 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.505093 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzbxd\" (UniqueName: \"kubernetes.io/projected/44517f61-1ece-44c0-8831-4a3e5d188c0f-kube-api-access-wzbxd\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.505112 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-scripts\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.508346 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c9857f-704f-48ce-b90b-9275e9eba41a-logs\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.508826 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: 
\"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.511020 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-logs\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.511088 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-config\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.511359 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.511593 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.512192 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 
14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.512675 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.522117 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-combined-ca-bundle\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.524800 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-scripts\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.526364 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.531522 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.536028 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-scripts\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.536848 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-config-data\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.538747 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-config-data\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.537737 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.542947 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbjgr\" (UniqueName: \"kubernetes.io/projected/86c9857f-704f-48ce-b90b-9275e9eba41a-kube-api-access-pbjgr\") pod \"placement-db-sync-2s85z\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.552530 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzbxd\" (UniqueName: 
\"kubernetes.io/projected/44517f61-1ece-44c0-8831-4a3e5d188c0f-kube-api-access-wzbxd\") pod \"dnsmasq-dns-785d8bcb8c-8jdnn\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.555313 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bjwv\" (UniqueName: \"kubernetes.io/projected/076cf316-66db-40bb-b3bd-bb4d35ebe53b-kube-api-access-8bjwv\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.561479 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.577123 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6w9g2"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.681348 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.692757 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2s85z" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.707601 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.783405 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.801539 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.812674 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.813060 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.820460 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.878238 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b4fbb8c45-zw22m"] Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.923877 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.924198 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.924230 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 
14:23:17.924253 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.924273 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.924402 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjtzb\" (UniqueName: \"kubernetes.io/projected/9a4866d4-0e60-44bd-855a-32b1809ca5c3-kube-api-access-qjtzb\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.924434 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:17 crc kubenswrapper[4911]: I0310 14:23:17.924465 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 
crc kubenswrapper[4911]: I0310 14:23:18.025862 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.025918 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.025948 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.025974 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.026000 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.026082 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qjtzb\" (UniqueName: \"kubernetes.io/projected/9a4866d4-0e60-44bd-855a-32b1809ca5c3-kube-api-access-qjtzb\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.026119 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.026158 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.026552 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.026965 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.027120 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.034925 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.035484 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.036087 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.036186 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.072563 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.074437 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjtzb\" (UniqueName: \"kubernetes.io/projected/9a4866d4-0e60-44bd-855a-32b1809ca5c3-kube-api-access-qjtzb\") pod \"glance-default-internal-api-0\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.129339 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bd5cf7c7f-sk89q"] Mar 10 14:23:18 crc kubenswrapper[4911]: W0310 14:23:18.129530 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod711535ec_b6c0_40b0_8aa0_8e7d5d50d3b9.slice/crio-b49ea2bd7f15edb14828deec5e303dfd3cc02df14ef43a938c1189e874a0decd WatchSource:0}: Error finding container b49ea2bd7f15edb14828deec5e303dfd3cc02df14ef43a938c1189e874a0decd: Status 404 returned error can't find the container with id b49ea2bd7f15edb14828deec5e303dfd3cc02df14ef43a938c1189e874a0decd Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.259680 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6w9g2" event={"ID":"27a109ba-d4dc-4947-82e0-59bfabd3ae4f","Type":"ContainerStarted","Data":"051705810d7dccfba382dd3136cb032e1f4847adc0a17c010e8866398f0209d6"} Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.260082 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6w9g2" event={"ID":"27a109ba-d4dc-4947-82e0-59bfabd3ae4f","Type":"ContainerStarted","Data":"dc2658a4fc044606cebce6c25c58371d128a00a13874c4df6f8c9e40c7ab2c7a"} Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.268051 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd5cf7c7f-sk89q" 
event={"ID":"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9","Type":"ContainerStarted","Data":"b49ea2bd7f15edb14828deec5e303dfd3cc02df14ef43a938c1189e874a0decd"} Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.269393 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.273897 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b4fbb8c45-zw22m" event={"ID":"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3","Type":"ContainerStarted","Data":"33971c6adf4442c1dab8dfad1d0c481829d1949a35367bb8c6c42ca912d2d146"} Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.318254 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-85swm"] Mar 10 14:23:18 crc kubenswrapper[4911]: W0310 14:23:18.324071 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67b878c7_d1cf_4656_8762_7be57cf1491a.slice/crio-8a5ff338a4c1800c1f12ee0b62ffe0ebd429be69d9ae20da1a08b406bea7166c WatchSource:0}: Error finding container 8a5ff338a4c1800c1f12ee0b62ffe0ebd429be69d9ae20da1a08b406bea7166c: Status 404 returned error can't find the container with id 8a5ff338a4c1800c1f12ee0b62ffe0ebd429be69d9ae20da1a08b406bea7166c Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.355463 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-k62d9"] Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.374171 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lzxzm"] Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.387174 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.822054 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wk4q4"] Mar 10 14:23:18 crc 
kubenswrapper[4911]: I0310 14:23:18.831141 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2s85z"] Mar 10 14:23:18 crc kubenswrapper[4911]: I0310 14:23:18.840032 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-8jdnn"] Mar 10 14:23:18 crc kubenswrapper[4911]: W0310 14:23:18.869548 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44517f61_1ece_44c0_8831_4a3e5d188c0f.slice/crio-f586471956fb5344c198b8def8f98a837ce06ecb6d9637902609d480fec1343e WatchSource:0}: Error finding container f586471956fb5344c198b8def8f98a837ce06ecb6d9637902609d480fec1343e: Status 404 returned error can't find the container with id f586471956fb5344c198b8def8f98a837ce06ecb6d9637902609d480fec1343e Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.080032 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.254811 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.280957 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.321415 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b4fbb8c45-zw22m"] Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.330283 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2s85z" event={"ID":"86c9857f-704f-48ce-b90b-9275e9eba41a","Type":"ContainerStarted","Data":"dd421ac8274f2e98374205cf457ffae84d5956847890feb5e1f733c7496a9adc"} Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.356899 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54cf4db59f-7jqnh"] Mar 10 14:23:19 crc 
kubenswrapper[4911]: I0310 14:23:19.375925 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076cf316-66db-40bb-b3bd-bb4d35ebe53b","Type":"ContainerStarted","Data":"6ffdce281bc1c9dd7152e3074c537e79ea89f794058bf49fee00437920597edb"} Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.375985 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-k62d9" event={"ID":"9942f116-fd81-4e92-bd0f-add9b12b4c08","Type":"ContainerStarted","Data":"e1d584d6deab54a0744f4f1a28bd19127f2e71d21515005941a42a78de7099db"} Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.376006 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"add8035b-c6a4-47d1-aa42-ed381ba87b11","Type":"ContainerStarted","Data":"0a00b1ddede815fc87526cf7ea6fd44510c226b1ae58cfae4b10d42d81dfb6d8"} Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.376129 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.377240 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" event={"ID":"44517f61-1ece-44c0-8831-4a3e5d188c0f","Type":"ContainerStarted","Data":"f586471956fb5344c198b8def8f98a837ce06ecb6d9637902609d480fec1343e"} Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.397783 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54cf4db59f-7jqnh"] Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.406201 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wk4q4" event={"ID":"f0db6312-44e4-4765-aabf-1d8620893756","Type":"ContainerStarted","Data":"6676f76e0f2c49fe4bd4639bf012bbaa6ff65f2cf0481da730a548210f508dff"} Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.428314 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.446072 4911 generic.go:334] "Generic (PLEG): container finished" podID="dc8244f1-07ac-4d43-888b-b7b5f8c792ea" containerID="db3a52e34798eeeb043bbcb82ebb0b74112f5252170a8d2650c7c8ac1436defb" exitCode=0 Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.446184 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" event={"ID":"dc8244f1-07ac-4d43-888b-b7b5f8c792ea","Type":"ContainerDied","Data":"db3a52e34798eeeb043bbcb82ebb0b74112f5252170a8d2650c7c8ac1436defb"} Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.446218 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" event={"ID":"dc8244f1-07ac-4d43-888b-b7b5f8c792ea","Type":"ContainerStarted","Data":"8646429cd7a54c03bfd4e60723b6ae7085ec19ba42fc0402735fc8b5e68d5151"} Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.464883 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.470600 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-85swm" event={"ID":"67b878c7-d1cf-4656-8762-7be57cf1491a","Type":"ContainerStarted","Data":"8a5ff338a4c1800c1f12ee0b62ffe0ebd429be69d9ae20da1a08b406bea7166c"} Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.502526 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nn5v\" (UniqueName: \"kubernetes.io/projected/5046ce07-0ef5-4efc-aaf4-0867d7acea09-kube-api-access-8nn5v\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.502583 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-scripts\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.502628 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-config-data\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.502652 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5046ce07-0ef5-4efc-aaf4-0867d7acea09-logs\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.502679 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5046ce07-0ef5-4efc-aaf4-0867d7acea09-horizon-secret-key\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.518746 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wk4q4" podStartSLOduration=3.518701481 podStartE2EDuration="3.518701481s" podCreationTimestamp="2026-03-10 14:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:19.463686294 +0000 UTC m=+1304.027206201" watchObservedRunningTime="2026-03-10 14:23:19.518701481 +0000 UTC m=+1304.082221398" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.540665 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6w9g2" podStartSLOduration=3.540640254 podStartE2EDuration="3.540640254s" podCreationTimestamp="2026-03-10 14:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:19.527108901 +0000 UTC m=+1304.090628818" watchObservedRunningTime="2026-03-10 14:23:19.540640254 +0000 UTC m=+1304.104160171" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.605055 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nn5v\" (UniqueName: \"kubernetes.io/projected/5046ce07-0ef5-4efc-aaf4-0867d7acea09-kube-api-access-8nn5v\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.605149 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-scripts\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.605235 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-config-data\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.605270 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5046ce07-0ef5-4efc-aaf4-0867d7acea09-logs\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.605302 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5046ce07-0ef5-4efc-aaf4-0867d7acea09-horizon-secret-key\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.607282 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-scripts\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.608083 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5046ce07-0ef5-4efc-aaf4-0867d7acea09-logs\") pod \"horizon-54cf4db59f-7jqnh\" (UID: 
\"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.612390 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-config-data\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.616617 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5046ce07-0ef5-4efc-aaf4-0867d7acea09-horizon-secret-key\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.650238 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nn5v\" (UniqueName: \"kubernetes.io/projected/5046ce07-0ef5-4efc-aaf4-0867d7acea09-kube-api-access-8nn5v\") pod \"horizon-54cf4db59f-7jqnh\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") " pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:19 crc kubenswrapper[4911]: I0310 14:23:19.739907 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.129937 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.230386 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-svc\") pod \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.230452 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsgzm\" (UniqueName: \"kubernetes.io/projected/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-kube-api-access-dsgzm\") pod \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.230524 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-sb\") pod \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.230621 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-nb\") pod \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.230699 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-config\") pod \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.230901 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-swift-storage-0\") pod \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\" (UID: \"dc8244f1-07ac-4d43-888b-b7b5f8c792ea\") " Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.241078 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-kube-api-access-dsgzm" (OuterVolumeSpecName: "kube-api-access-dsgzm") pod "dc8244f1-07ac-4d43-888b-b7b5f8c792ea" (UID: "dc8244f1-07ac-4d43-888b-b7b5f8c792ea"). InnerVolumeSpecName "kube-api-access-dsgzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.260081 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc8244f1-07ac-4d43-888b-b7b5f8c792ea" (UID: "dc8244f1-07ac-4d43-888b-b7b5f8c792ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.260467 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dc8244f1-07ac-4d43-888b-b7b5f8c792ea" (UID: "dc8244f1-07ac-4d43-888b-b7b5f8c792ea"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.263614 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-config" (OuterVolumeSpecName: "config") pod "dc8244f1-07ac-4d43-888b-b7b5f8c792ea" (UID: "dc8244f1-07ac-4d43-888b-b7b5f8c792ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.265576 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dc8244f1-07ac-4d43-888b-b7b5f8c792ea" (UID: "dc8244f1-07ac-4d43-888b-b7b5f8c792ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.267331 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dc8244f1-07ac-4d43-888b-b7b5f8c792ea" (UID: "dc8244f1-07ac-4d43-888b-b7b5f8c792ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.338006 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.338038 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.338054 4911 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.338069 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 
14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.338080 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsgzm\" (UniqueName: \"kubernetes.io/projected/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-kube-api-access-dsgzm\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.338090 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc8244f1-07ac-4d43-888b-b7b5f8c792ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.508861 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54cf4db59f-7jqnh"] Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.569633 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" event={"ID":"dc8244f1-07ac-4d43-888b-b7b5f8c792ea","Type":"ContainerDied","Data":"8646429cd7a54c03bfd4e60723b6ae7085ec19ba42fc0402735fc8b5e68d5151"} Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.569735 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-lzxzm" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.569765 4911 scope.go:117] "RemoveContainer" containerID="db3a52e34798eeeb043bbcb82ebb0b74112f5252170a8d2650c7c8ac1436defb" Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.597534 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076cf316-66db-40bb-b3bd-bb4d35ebe53b","Type":"ContainerStarted","Data":"87bf520dc0bf408a1604ba94be75b81792ccfe3d1b19d6c7f343973225cdc462"} Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.612065 4911 generic.go:334] "Generic (PLEG): container finished" podID="44517f61-1ece-44c0-8831-4a3e5d188c0f" containerID="525e76ed2e8269372d610c9c40a8d29d78fbf71de09527d59813ce36f53b7785" exitCode=0 Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.612218 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" event={"ID":"44517f61-1ece-44c0-8831-4a3e5d188c0f","Type":"ContainerDied","Data":"525e76ed2e8269372d610c9c40a8d29d78fbf71de09527d59813ce36f53b7785"} Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.646029 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wk4q4" event={"ID":"f0db6312-44e4-4765-aabf-1d8620893756","Type":"ContainerStarted","Data":"7e0018a8a78228a04af27046b61e98bbd97a4f3e18c6cee45893c102e2ebb75b"} Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.662881 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a4866d4-0e60-44bd-855a-32b1809ca5c3","Type":"ContainerStarted","Data":"b2c0aeadf180e517f78fd504cbf9cdba5bb76f906b4b2f0ab73f2502b24897bd"} Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.663023 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9a4866d4-0e60-44bd-855a-32b1809ca5c3","Type":"ContainerStarted","Data":"e22a902c675ee0dde27db9112d092f19ff92851fda580775970692042df651b6"} Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.736208 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lzxzm"] Mar 10 14:23:20 crc kubenswrapper[4911]: I0310 14:23:20.760522 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lzxzm"] Mar 10 14:23:21 crc kubenswrapper[4911]: I0310 14:23:21.681032 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54cf4db59f-7jqnh" event={"ID":"5046ce07-0ef5-4efc-aaf4-0867d7acea09","Type":"ContainerStarted","Data":"50d04e31115973ce6dd5d4a98c5f413e55e28be2c72c9ee25a7b43ac0abd8dd6"} Mar 10 14:23:21 crc kubenswrapper[4911]: I0310 14:23:21.684121 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" event={"ID":"44517f61-1ece-44c0-8831-4a3e5d188c0f","Type":"ContainerStarted","Data":"5075f9aa56253f4c1187726c7b20ff42243972d7195226fb17d3d88487655ace"} Mar 10 14:23:21 crc kubenswrapper[4911]: I0310 14:23:21.685735 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:21 crc kubenswrapper[4911]: I0310 14:23:21.689838 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a4866d4-0e60-44bd-855a-32b1809ca5c3","Type":"ContainerStarted","Data":"d387f57d1dfe0f5bc6341bccf0261e12c888379e816e0853d29553752ef3109b"} Mar 10 14:23:21 crc kubenswrapper[4911]: I0310 14:23:21.718215 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" podStartSLOduration=4.718195679 podStartE2EDuration="4.718195679s" podCreationTimestamp="2026-03-10 14:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-10 14:23:21.717382538 +0000 UTC m=+1306.280902465" watchObservedRunningTime="2026-03-10 14:23:21.718195679 +0000 UTC m=+1306.281715586" Mar 10 14:23:22 crc kubenswrapper[4911]: I0310 14:23:22.211704 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc8244f1-07ac-4d43-888b-b7b5f8c792ea" path="/var/lib/kubelet/pods/dc8244f1-07ac-4d43-888b-b7b5f8c792ea/volumes" Mar 10 14:23:22 crc kubenswrapper[4911]: I0310 14:23:22.704077 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076cf316-66db-40bb-b3bd-bb4d35ebe53b","Type":"ContainerStarted","Data":"627e04343ef71b078e29027642a423ae4c3db5daa0055df3e1697db87a6a0daf"} Mar 10 14:23:22 crc kubenswrapper[4911]: I0310 14:23:22.704122 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" containerName="glance-log" containerID="cri-o://b2c0aeadf180e517f78fd504cbf9cdba5bb76f906b4b2f0ab73f2502b24897bd" gracePeriod=30 Mar 10 14:23:22 crc kubenswrapper[4911]: I0310 14:23:22.704679 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" containerName="glance-log" containerID="cri-o://87bf520dc0bf408a1604ba94be75b81792ccfe3d1b19d6c7f343973225cdc462" gracePeriod=30 Mar 10 14:23:22 crc kubenswrapper[4911]: I0310 14:23:22.704691 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" containerName="glance-httpd" containerID="cri-o://d387f57d1dfe0f5bc6341bccf0261e12c888379e816e0853d29553752ef3109b" gracePeriod=30 Mar 10 14:23:22 crc kubenswrapper[4911]: I0310 14:23:22.704791 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" containerName="glance-httpd" containerID="cri-o://627e04343ef71b078e29027642a423ae4c3db5daa0055df3e1697db87a6a0daf" gracePeriod=30 Mar 10 14:23:22 crc kubenswrapper[4911]: I0310 14:23:22.762169 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.76213147 podStartE2EDuration="6.76213147s" podCreationTimestamp="2026-03-10 14:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:22.754137691 +0000 UTC m=+1307.317657608" watchObservedRunningTime="2026-03-10 14:23:22.76213147 +0000 UTC m=+1307.325651387" Mar 10 14:23:22 crc kubenswrapper[4911]: I0310 14:23:22.784372 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.78435071 podStartE2EDuration="5.78435071s" podCreationTimestamp="2026-03-10 14:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:22.783826236 +0000 UTC m=+1307.347346153" watchObservedRunningTime="2026-03-10 14:23:22.78435071 +0000 UTC m=+1307.347870617" Mar 10 14:23:23 crc kubenswrapper[4911]: I0310 14:23:23.722493 4911 generic.go:334] "Generic (PLEG): container finished" podID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" containerID="d387f57d1dfe0f5bc6341bccf0261e12c888379e816e0853d29553752ef3109b" exitCode=0 Mar 10 14:23:23 crc kubenswrapper[4911]: I0310 14:23:23.722824 4911 generic.go:334] "Generic (PLEG): container finished" podID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" containerID="b2c0aeadf180e517f78fd504cbf9cdba5bb76f906b4b2f0ab73f2502b24897bd" exitCode=143 Mar 10 14:23:23 crc kubenswrapper[4911]: I0310 14:23:23.722875 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9a4866d4-0e60-44bd-855a-32b1809ca5c3","Type":"ContainerDied","Data":"d387f57d1dfe0f5bc6341bccf0261e12c888379e816e0853d29553752ef3109b"} Mar 10 14:23:23 crc kubenswrapper[4911]: I0310 14:23:23.722912 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a4866d4-0e60-44bd-855a-32b1809ca5c3","Type":"ContainerDied","Data":"b2c0aeadf180e517f78fd504cbf9cdba5bb76f906b4b2f0ab73f2502b24897bd"} Mar 10 14:23:23 crc kubenswrapper[4911]: I0310 14:23:23.726008 4911 generic.go:334] "Generic (PLEG): container finished" podID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" containerID="627e04343ef71b078e29027642a423ae4c3db5daa0055df3e1697db87a6a0daf" exitCode=0 Mar 10 14:23:23 crc kubenswrapper[4911]: I0310 14:23:23.726029 4911 generic.go:334] "Generic (PLEG): container finished" podID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" containerID="87bf520dc0bf408a1604ba94be75b81792ccfe3d1b19d6c7f343973225cdc462" exitCode=143 Mar 10 14:23:23 crc kubenswrapper[4911]: I0310 14:23:23.727138 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076cf316-66db-40bb-b3bd-bb4d35ebe53b","Type":"ContainerDied","Data":"627e04343ef71b078e29027642a423ae4c3db5daa0055df3e1697db87a6a0daf"} Mar 10 14:23:23 crc kubenswrapper[4911]: I0310 14:23:23.727212 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076cf316-66db-40bb-b3bd-bb4d35ebe53b","Type":"ContainerDied","Data":"87bf520dc0bf408a1604ba94be75b81792ccfe3d1b19d6c7f343973225cdc462"} Mar 10 14:23:24 crc kubenswrapper[4911]: I0310 14:23:24.738671 4911 generic.go:334] "Generic (PLEG): container finished" podID="27a109ba-d4dc-4947-82e0-59bfabd3ae4f" containerID="051705810d7dccfba382dd3136cb032e1f4847adc0a17c010e8866398f0209d6" exitCode=0 Mar 10 14:23:24 crc kubenswrapper[4911]: I0310 14:23:24.738782 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-6w9g2" event={"ID":"27a109ba-d4dc-4947-82e0-59bfabd3ae4f","Type":"ContainerDied","Data":"051705810d7dccfba382dd3136cb032e1f4847adc0a17c010e8866398f0209d6"} Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.473013 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bd5cf7c7f-sk89q"] Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.517710 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b4dd68964-gfvp8"] Mar 10 14:23:25 crc kubenswrapper[4911]: E0310 14:23:25.526687 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8244f1-07ac-4d43-888b-b7b5f8c792ea" containerName="init" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.526757 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8244f1-07ac-4d43-888b-b7b5f8c792ea" containerName="init" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.527037 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8244f1-07ac-4d43-888b-b7b5f8c792ea" containerName="init" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.528114 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.536024 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.557586 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b4dd68964-gfvp8"] Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.574839 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlzl6\" (UniqueName: \"kubernetes.io/projected/a546f2b5-3536-4608-b1f4-0127ebd52bfa-kube-api-access-zlzl6\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.574906 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-config-data\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.574948 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a546f2b5-3536-4608-b1f4-0127ebd52bfa-logs\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.574987 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-scripts\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc 
kubenswrapper[4911]: I0310 14:23:25.575103 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-combined-ca-bundle\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.575134 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-tls-certs\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.575152 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-secret-key\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.625032 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54cf4db59f-7jqnh"] Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.674016 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54d884b5d4-lsz26"] Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.675956 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.677286 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlzl6\" (UniqueName: \"kubernetes.io/projected/a546f2b5-3536-4608-b1f4-0127ebd52bfa-kube-api-access-zlzl6\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.677346 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-config-data\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.677392 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a546f2b5-3536-4608-b1f4-0127ebd52bfa-logs\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.677432 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-scripts\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.677508 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-combined-ca-bundle\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 
14:23:25.677548 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-tls-certs\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.677566 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-secret-key\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.678515 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a546f2b5-3536-4608-b1f4-0127ebd52bfa-logs\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.680541 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-config-data\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.681539 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-scripts\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.686355 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-secret-key\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.691937 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-combined-ca-bundle\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.696162 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54d884b5d4-lsz26"] Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.703997 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-tls-certs\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.704747 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlzl6\" (UniqueName: \"kubernetes.io/projected/a546f2b5-3536-4608-b1f4-0127ebd52bfa-kube-api-access-zlzl6\") pod \"horizon-7b4dd68964-gfvp8\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.779719 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6be9e57d-52b9-4de2-9201-1b85feda712c-logs\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.780756 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6be9e57d-52b9-4de2-9201-1b85feda712c-horizon-tls-certs\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.781002 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6be9e57d-52b9-4de2-9201-1b85feda712c-config-data\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.781086 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l744k\" (UniqueName: \"kubernetes.io/projected/6be9e57d-52b9-4de2-9201-1b85feda712c-kube-api-access-l744k\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.781127 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be9e57d-52b9-4de2-9201-1b85feda712c-combined-ca-bundle\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.781198 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6be9e57d-52b9-4de2-9201-1b85feda712c-scripts\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.781244 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6be9e57d-52b9-4de2-9201-1b85feda712c-horizon-secret-key\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.852690 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.883685 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l744k\" (UniqueName: \"kubernetes.io/projected/6be9e57d-52b9-4de2-9201-1b85feda712c-kube-api-access-l744k\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.883773 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be9e57d-52b9-4de2-9201-1b85feda712c-combined-ca-bundle\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.883853 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6be9e57d-52b9-4de2-9201-1b85feda712c-scripts\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.883894 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6be9e57d-52b9-4de2-9201-1b85feda712c-horizon-secret-key\") pod \"horizon-54d884b5d4-lsz26\" (UID: 
\"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.883960 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6be9e57d-52b9-4de2-9201-1b85feda712c-logs\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.884017 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6be9e57d-52b9-4de2-9201-1b85feda712c-horizon-tls-certs\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.884069 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6be9e57d-52b9-4de2-9201-1b85feda712c-config-data\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.885616 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6be9e57d-52b9-4de2-9201-1b85feda712c-config-data\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.895259 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6be9e57d-52b9-4de2-9201-1b85feda712c-logs\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.895741 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be9e57d-52b9-4de2-9201-1b85feda712c-combined-ca-bundle\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.895882 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6be9e57d-52b9-4de2-9201-1b85feda712c-scripts\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.904318 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6be9e57d-52b9-4de2-9201-1b85feda712c-horizon-tls-certs\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.904349 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6be9e57d-52b9-4de2-9201-1b85feda712c-horizon-secret-key\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:25 crc kubenswrapper[4911]: I0310 14:23:25.908329 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l744k\" (UniqueName: \"kubernetes.io/projected/6be9e57d-52b9-4de2-9201-1b85feda712c-kube-api-access-l744k\") pod \"horizon-54d884b5d4-lsz26\" (UID: \"6be9e57d-52b9-4de2-9201-1b85feda712c\") " pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:26 crc kubenswrapper[4911]: I0310 14:23:26.165477 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:27 crc kubenswrapper[4911]: I0310 14:23:27.682959 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:27 crc kubenswrapper[4911]: I0310 14:23:27.756541 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kbjd8"] Mar 10 14:23:27 crc kubenswrapper[4911]: I0310 14:23:27.756874 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" podUID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerName="dnsmasq-dns" containerID="cri-o://f7582e25553c29861572627ca6db36fb8b1f3f33de8075082558cdb6b996be71" gracePeriod=10 Mar 10 14:23:28 crc kubenswrapper[4911]: I0310 14:23:28.795606 4911 generic.go:334] "Generic (PLEG): container finished" podID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerID="f7582e25553c29861572627ca6db36fb8b1f3f33de8075082558cdb6b996be71" exitCode=0 Mar 10 14:23:28 crc kubenswrapper[4911]: I0310 14:23:28.795865 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" event={"ID":"244b08cc-4c8e-453c-a04c-19972742b5ca","Type":"ContainerDied","Data":"f7582e25553c29861572627ca6db36fb8b1f3f33de8075082558cdb6b996be71"} Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.681622 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" podUID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.824490 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076cf316-66db-40bb-b3bd-bb4d35ebe53b","Type":"ContainerDied","Data":"6ffdce281bc1c9dd7152e3074c537e79ea89f794058bf49fee00437920597edb"} Mar 10 14:23:31 crc 
kubenswrapper[4911]: I0310 14:23:31.824548 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ffdce281bc1c9dd7152e3074c537e79ea89f794058bf49fee00437920597edb" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.845794 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.936373 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-logs\") pod \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.936451 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-config-data\") pod \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.936608 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bjwv\" (UniqueName: \"kubernetes.io/projected/076cf316-66db-40bb-b3bd-bb4d35ebe53b-kube-api-access-8bjwv\") pod \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.936712 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-httpd-run\") pod \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.936805 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-public-tls-certs\") pod \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.936864 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-combined-ca-bundle\") pod \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.936904 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.936933 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-scripts\") pod \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\" (UID: \"076cf316-66db-40bb-b3bd-bb4d35ebe53b\") " Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.937032 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-logs" (OuterVolumeSpecName: "logs") pod "076cf316-66db-40bb-b3bd-bb4d35ebe53b" (UID: "076cf316-66db-40bb-b3bd-bb4d35ebe53b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.937296 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "076cf316-66db-40bb-b3bd-bb4d35ebe53b" (UID: "076cf316-66db-40bb-b3bd-bb4d35ebe53b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.937598 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.937611 4911 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/076cf316-66db-40bb-b3bd-bb4d35ebe53b-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.943768 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-scripts" (OuterVolumeSpecName: "scripts") pod "076cf316-66db-40bb-b3bd-bb4d35ebe53b" (UID: "076cf316-66db-40bb-b3bd-bb4d35ebe53b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.944086 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076cf316-66db-40bb-b3bd-bb4d35ebe53b-kube-api-access-8bjwv" (OuterVolumeSpecName: "kube-api-access-8bjwv") pod "076cf316-66db-40bb-b3bd-bb4d35ebe53b" (UID: "076cf316-66db-40bb-b3bd-bb4d35ebe53b"). InnerVolumeSpecName "kube-api-access-8bjwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.957218 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "076cf316-66db-40bb-b3bd-bb4d35ebe53b" (UID: "076cf316-66db-40bb-b3bd-bb4d35ebe53b"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.968683 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "076cf316-66db-40bb-b3bd-bb4d35ebe53b" (UID: "076cf316-66db-40bb-b3bd-bb4d35ebe53b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.982949 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-config-data" (OuterVolumeSpecName: "config-data") pod "076cf316-66db-40bb-b3bd-bb4d35ebe53b" (UID: "076cf316-66db-40bb-b3bd-bb4d35ebe53b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:31 crc kubenswrapper[4911]: I0310 14:23:31.987289 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "076cf316-66db-40bb-b3bd-bb4d35ebe53b" (UID: "076cf316-66db-40bb-b3bd-bb4d35ebe53b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.039190 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bjwv\" (UniqueName: \"kubernetes.io/projected/076cf316-66db-40bb-b3bd-bb4d35ebe53b-kube-api-access-8bjwv\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.039224 4911 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.039234 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.039275 4911 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.039285 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.039294 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076cf316-66db-40bb-b3bd-bb4d35ebe53b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.060610 4911 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.142495 4911 reconciler_common.go:293] "Volume detached for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.836094 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.858538 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.867033 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.885998 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:23:32 crc kubenswrapper[4911]: E0310 14:23:32.886475 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" containerName="glance-httpd" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.886492 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" containerName="glance-httpd" Mar 10 14:23:32 crc kubenswrapper[4911]: E0310 14:23:32.886510 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" containerName="glance-log" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.886516 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" containerName="glance-log" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.886714 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" containerName="glance-httpd" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.886751 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" containerName="glance-log" Mar 10 14:23:32 
crc kubenswrapper[4911]: I0310 14:23:32.887907 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.892050 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.892407 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 14:23:32 crc kubenswrapper[4911]: I0310 14:23:32.901121 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.061491 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-config-data\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.061576 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.061674 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhz5p\" (UniqueName: \"kubernetes.io/projected/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-kube-api-access-fhz5p\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.061749 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.061770 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-logs\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.061796 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-scripts\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.061822 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.061855 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.163734 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-config-data\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.163813 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.163959 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhz5p\" (UniqueName: \"kubernetes.io/projected/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-kube-api-access-fhz5p\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.164425 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.164573 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.164602 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-logs\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.165015 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-logs\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.165370 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-scripts\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.165408 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.165486 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.166054 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.170088 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-config-data\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.171461 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-scripts\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.171708 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.177653 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.180928 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhz5p\" (UniqueName: \"kubernetes.io/projected/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-kube-api-access-fhz5p\") pod \"glance-default-external-api-0\" (UID: 
\"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.206202 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") " pod="openstack/glance-default-external-api-0" Mar 10 14:23:33 crc kubenswrapper[4911]: I0310 14:23:33.507378 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 14:23:34 crc kubenswrapper[4911]: I0310 14:23:34.207986 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076cf316-66db-40bb-b3bd-bb4d35ebe53b" path="/var/lib/kubelet/pods/076cf316-66db-40bb-b3bd-bb4d35ebe53b/volumes" Mar 10 14:23:36 crc kubenswrapper[4911]: I0310 14:23:36.681236 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" podUID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Mar 10 14:23:39 crc kubenswrapper[4911]: E0310 14:23:39.617586 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 10 14:23:39 crc kubenswrapper[4911]: E0310 14:23:39.618527 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n666h579hd4h65chf4h5dbh54fh66dh9ch698h668h65fh5d9h569h588h569h69h588hcfh665hc5h57h78h94h665h655h648h55ch84h5bbh5d5h655q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gc8w6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6b4fbb8c45-zw22m_openstack(20e4b47f-2e05-4df4-8ba3-f53950a3d2f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:23:39 crc kubenswrapper[4911]: E0310 
14:23:39.622344 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6b4fbb8c45-zw22m" podUID="20e4b47f-2e05-4df4-8ba3-f53950a3d2f3" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.710578 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.826343 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-combined-ca-bundle\") pod \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.826516 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-config-data\") pod \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.826574 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-fernet-keys\") pod \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.826627 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-credential-keys\") pod 
\"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.826829 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8xkb\" (UniqueName: \"kubernetes.io/projected/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-kube-api-access-n8xkb\") pod \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.826864 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-scripts\") pod \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\" (UID: \"27a109ba-d4dc-4947-82e0-59bfabd3ae4f\") " Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.834472 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "27a109ba-d4dc-4947-82e0-59bfabd3ae4f" (UID: "27a109ba-d4dc-4947-82e0-59bfabd3ae4f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.847506 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "27a109ba-d4dc-4947-82e0-59bfabd3ae4f" (UID: "27a109ba-d4dc-4947-82e0-59bfabd3ae4f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.848947 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-scripts" (OuterVolumeSpecName: "scripts") pod "27a109ba-d4dc-4947-82e0-59bfabd3ae4f" (UID: "27a109ba-d4dc-4947-82e0-59bfabd3ae4f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.852075 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-kube-api-access-n8xkb" (OuterVolumeSpecName: "kube-api-access-n8xkb") pod "27a109ba-d4dc-4947-82e0-59bfabd3ae4f" (UID: "27a109ba-d4dc-4947-82e0-59bfabd3ae4f"). InnerVolumeSpecName "kube-api-access-n8xkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.854984 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-config-data" (OuterVolumeSpecName: "config-data") pod "27a109ba-d4dc-4947-82e0-59bfabd3ae4f" (UID: "27a109ba-d4dc-4947-82e0-59bfabd3ae4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.865105 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27a109ba-d4dc-4947-82e0-59bfabd3ae4f" (UID: "27a109ba-d4dc-4947-82e0-59bfabd3ae4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.912358 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6w9g2" event={"ID":"27a109ba-d4dc-4947-82e0-59bfabd3ae4f","Type":"ContainerDied","Data":"dc2658a4fc044606cebce6c25c58371d128a00a13874c4df6f8c9e40c7ab2c7a"} Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.912405 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6w9g2" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.912419 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc2658a4fc044606cebce6c25c58371d128a00a13874c4df6f8c9e40c7ab2c7a" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.929856 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.929891 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.929905 4911 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.929918 4911 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.929930 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8xkb\" (UniqueName: \"kubernetes.io/projected/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-kube-api-access-n8xkb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:39 crc kubenswrapper[4911]: I0310 14:23:39.929942 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27a109ba-d4dc-4947-82e0-59bfabd3ae4f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.812003 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-bootstrap-6w9g2"] Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.820407 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6w9g2"] Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.920314 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9w5xj"] Mar 10 14:23:40 crc kubenswrapper[4911]: E0310 14:23:40.920858 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a109ba-d4dc-4947-82e0-59bfabd3ae4f" containerName="keystone-bootstrap" Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.920877 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a109ba-d4dc-4947-82e0-59bfabd3ae4f" containerName="keystone-bootstrap" Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.921125 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a109ba-d4dc-4947-82e0-59bfabd3ae4f" containerName="keystone-bootstrap" Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.921880 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.924748 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.926420 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.926787 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4vwm6" Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.926807 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.927088 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 14:23:40 crc kubenswrapper[4911]: I0310 14:23:40.933980 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9w5xj"] Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.068691 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-combined-ca-bundle\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.068791 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-scripts\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.068891 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-fernet-keys\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.068946 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-config-data\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.069010 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-credential-keys\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.069057 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p82s\" (UniqueName: \"kubernetes.io/projected/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-kube-api-access-5p82s\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.171292 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-combined-ca-bundle\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.171415 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-scripts\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.171495 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-fernet-keys\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.171524 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-config-data\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.171559 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-credential-keys\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.171595 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p82s\" (UniqueName: \"kubernetes.io/projected/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-kube-api-access-5p82s\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.177829 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-combined-ca-bundle\") pod 
\"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.178096 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-fernet-keys\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.178423 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-config-data\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.178662 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-scripts\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.179535 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-credential-keys\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.192113 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p82s\" (UniqueName: \"kubernetes.io/projected/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-kube-api-access-5p82s\") pod \"keystone-bootstrap-9w5xj\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc 
kubenswrapper[4911]: I0310 14:23:41.247231 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.681812 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" podUID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Mar 10 14:23:41 crc kubenswrapper[4911]: I0310 14:23:41.682014 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:23:41 crc kubenswrapper[4911]: E0310 14:23:41.806011 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 10 14:23:41 crc kubenswrapper[4911]: E0310 14:23:41.806219 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pbjgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-2s85z_openstack(86c9857f-704f-48ce-b90b-9275e9eba41a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:23:41 crc kubenswrapper[4911]: E0310 14:23:41.807437 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-2s85z" podUID="86c9857f-704f-48ce-b90b-9275e9eba41a" Mar 10 14:23:41 crc kubenswrapper[4911]: E0310 14:23:41.943371 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-2s85z" podUID="86c9857f-704f-48ce-b90b-9275e9eba41a" Mar 10 14:23:42 crc kubenswrapper[4911]: I0310 14:23:42.203654 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a109ba-d4dc-4947-82e0-59bfabd3ae4f" path="/var/lib/kubelet/pods/27a109ba-d4dc-4947-82e0-59bfabd3ae4f/volumes" Mar 10 14:23:43 crc kubenswrapper[4911]: I0310 14:23:43.963104 4911 generic.go:334] "Generic (PLEG): container finished" podID="f0db6312-44e4-4765-aabf-1d8620893756" containerID="7e0018a8a78228a04af27046b61e98bbd97a4f3e18c6cee45893c102e2ebb75b" exitCode=0 Mar 10 14:23:43 crc kubenswrapper[4911]: I0310 14:23:43.963432 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wk4q4" event={"ID":"f0db6312-44e4-4765-aabf-1d8620893756","Type":"ContainerDied","Data":"7e0018a8a78228a04af27046b61e98bbd97a4f3e18c6cee45893c102e2ebb75b"} Mar 10 14:23:44 crc kubenswrapper[4911]: E0310 14:23:44.154842 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 10 14:23:44 crc kubenswrapper[4911]: E0310 14:23:44.155103 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n665h89h68ch565h68fh5b4h98h99hb9h686h588h56h655h578hf4h9bh7hbfh5cfh5dbh64ch689h5f7h67ch569h99h5cfh556h694h545h655h697q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6sks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevic
es:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-bd5cf7c7f-sk89q_openstack(711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:23:44 crc kubenswrapper[4911]: E0310 14:23:44.167336 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-bd5cf7c7f-sk89q" podUID="711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.257036 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.457362 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-scripts\") pod \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.457475 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-config-data\") pod \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.457517 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-logs\") pod \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\" (UID: 
\"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.457586 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.457645 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-httpd-run\") pod \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.457715 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-internal-tls-certs\") pod \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.457773 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjtzb\" (UniqueName: \"kubernetes.io/projected/9a4866d4-0e60-44bd-855a-32b1809ca5c3-kube-api-access-qjtzb\") pod \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.457802 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-combined-ca-bundle\") pod \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\" (UID: \"9a4866d4-0e60-44bd-855a-32b1809ca5c3\") " Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.458545 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-logs" 
(OuterVolumeSpecName: "logs") pod "9a4866d4-0e60-44bd-855a-32b1809ca5c3" (UID: "9a4866d4-0e60-44bd-855a-32b1809ca5c3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.459133 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9a4866d4-0e60-44bd-855a-32b1809ca5c3" (UID: "9a4866d4-0e60-44bd-855a-32b1809ca5c3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.464782 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "9a4866d4-0e60-44bd-855a-32b1809ca5c3" (UID: "9a4866d4-0e60-44bd-855a-32b1809ca5c3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.465324 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-scripts" (OuterVolumeSpecName: "scripts") pod "9a4866d4-0e60-44bd-855a-32b1809ca5c3" (UID: "9a4866d4-0e60-44bd-855a-32b1809ca5c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.466024 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a4866d4-0e60-44bd-855a-32b1809ca5c3-kube-api-access-qjtzb" (OuterVolumeSpecName: "kube-api-access-qjtzb") pod "9a4866d4-0e60-44bd-855a-32b1809ca5c3" (UID: "9a4866d4-0e60-44bd-855a-32b1809ca5c3"). InnerVolumeSpecName "kube-api-access-qjtzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.500091 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a4866d4-0e60-44bd-855a-32b1809ca5c3" (UID: "9a4866d4-0e60-44bd-855a-32b1809ca5c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.508324 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-config-data" (OuterVolumeSpecName: "config-data") pod "9a4866d4-0e60-44bd-855a-32b1809ca5c3" (UID: "9a4866d4-0e60-44bd-855a-32b1809ca5c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.517505 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9a4866d4-0e60-44bd-855a-32b1809ca5c3" (UID: "9a4866d4-0e60-44bd-855a-32b1809ca5c3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.560513 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjtzb\" (UniqueName: \"kubernetes.io/projected/9a4866d4-0e60-44bd-855a-32b1809ca5c3-kube-api-access-qjtzb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.560561 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.560573 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.560585 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.560598 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-logs\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.560646 4911 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.560657 4911 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a4866d4-0e60-44bd-855a-32b1809ca5c3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.560671 4911 reconciler_common.go:293] "Volume detached for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a4866d4-0e60-44bd-855a-32b1809ca5c3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.579160 4911 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.662871 4911 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.976459 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a4866d4-0e60-44bd-855a-32b1809ca5c3","Type":"ContainerDied","Data":"e22a902c675ee0dde27db9112d092f19ff92851fda580775970692042df651b6"} Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.976890 4911 scope.go:117] "RemoveContainer" containerID="d387f57d1dfe0f5bc6341bccf0261e12c888379e816e0853d29553752ef3109b" Mar 10 14:23:44 crc kubenswrapper[4911]: I0310 14:23:44.976549 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.048795 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.057831 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.070547 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:23:45 crc kubenswrapper[4911]: E0310 14:23:45.071126 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" containerName="glance-httpd" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.071147 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" containerName="glance-httpd" Mar 10 14:23:45 crc kubenswrapper[4911]: E0310 14:23:45.071174 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" containerName="glance-log" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.071183 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" containerName="glance-log" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.071437 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" containerName="glance-httpd" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.071460 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" containerName="glance-log" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.072488 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.077623 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.077670 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.091213 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.281837 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-logs\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.281886 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.281944 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.281980 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.282006 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lhcs\" (UniqueName: \"kubernetes.io/projected/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-kube-api-access-4lhcs\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.282026 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.282047 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.282108 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.383839 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.383892 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lhcs\" (UniqueName: \"kubernetes.io/projected/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-kube-api-access-4lhcs\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.383916 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.383936 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.383977 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.384020 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-logs\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.384050 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.384090 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.384923 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.384954 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-logs\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.387386 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.389124 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.389816 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.394462 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.400544 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.410129 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lhcs\" (UniqueName: \"kubernetes.io/projected/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-kube-api-access-4lhcs\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.443978 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: E0310 14:23:45.608851 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 10 14:23:45 crc kubenswrapper[4911]: E0310 14:23:45.609503 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:n
il,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6tjk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-k62d9_openstack(9942f116-fd81-4e92-bd0f-add9b12b4c08): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:23:45 crc kubenswrapper[4911]: E0310 14:23:45.610777 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-k62d9" podUID="9942f116-fd81-4e92-bd0f-add9b12b4c08" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.629865 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.701699 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.790349 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-scripts\") pod \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.790715 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-horizon-secret-key\") pod \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.790906 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-config-data\") pod \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.791145 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc8w6\" (UniqueName: \"kubernetes.io/projected/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-kube-api-access-gc8w6\") pod \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.791287 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-logs\") pod \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\" (UID: \"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3\") " 
Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.791191 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-scripts" (OuterVolumeSpecName: "scripts") pod "20e4b47f-2e05-4df4-8ba3-f53950a3d2f3" (UID: "20e4b47f-2e05-4df4-8ba3-f53950a3d2f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.791669 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-config-data" (OuterVolumeSpecName: "config-data") pod "20e4b47f-2e05-4df4-8ba3-f53950a3d2f3" (UID: "20e4b47f-2e05-4df4-8ba3-f53950a3d2f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.791689 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-logs" (OuterVolumeSpecName: "logs") pod "20e4b47f-2e05-4df4-8ba3-f53950a3d2f3" (UID: "20e4b47f-2e05-4df4-8ba3-f53950a3d2f3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.792088 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.792159 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.792224 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-logs\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.795889 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-kube-api-access-gc8w6" (OuterVolumeSpecName: "kube-api-access-gc8w6") pod "20e4b47f-2e05-4df4-8ba3-f53950a3d2f3" (UID: "20e4b47f-2e05-4df4-8ba3-f53950a3d2f3"). InnerVolumeSpecName "kube-api-access-gc8w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.797713 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "20e4b47f-2e05-4df4-8ba3-f53950a3d2f3" (UID: "20e4b47f-2e05-4df4-8ba3-f53950a3d2f3"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.893689 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc8w6\" (UniqueName: \"kubernetes.io/projected/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-kube-api-access-gc8w6\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.893739 4911 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.989227 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b4fbb8c45-zw22m" Mar 10 14:23:45 crc kubenswrapper[4911]: I0310 14:23:45.989284 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b4fbb8c45-zw22m" event={"ID":"20e4b47f-2e05-4df4-8ba3-f53950a3d2f3","Type":"ContainerDied","Data":"33971c6adf4442c1dab8dfad1d0c481829d1949a35367bb8c6c42ca912d2d146"} Mar 10 14:23:45 crc kubenswrapper[4911]: E0310 14:23:45.991858 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-k62d9" podUID="9942f116-fd81-4e92-bd0f-add9b12b4c08" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.071763 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b4fbb8c45-zw22m"] Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.079209 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b4fbb8c45-zw22m"] Mar 10 14:23:46 crc kubenswrapper[4911]: E0310 14:23:46.127558 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context 
canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 10 14:23:46 crc kubenswrapper[4911]: E0310 14:23:46.127823 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bhpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-85swm_openstack(67b878c7-d1cf-4656-8762-7be57cf1491a): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Mar 10 14:23:46 crc kubenswrapper[4911]: E0310 14:23:46.129319 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-85swm" podUID="67b878c7-d1cf-4656-8762-7be57cf1491a" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.138087 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.147090 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.153499 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.210495 4911 scope.go:117] "RemoveContainer" containerID="b2c0aeadf180e517f78fd504cbf9cdba5bb76f906b4b2f0ab73f2502b24897bd" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.223320 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e4b47f-2e05-4df4-8ba3-f53950a3d2f3" path="/var/lib/kubelet/pods/20e4b47f-2e05-4df4-8ba3-f53950a3d2f3/volumes" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.224101 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a4866d4-0e60-44bd-855a-32b1809ca5c3" path="/var/lib/kubelet/pods/9a4866d4-0e60-44bd-855a-32b1809ca5c3/volumes" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.302248 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2br5g\" (UniqueName: \"kubernetes.io/projected/244b08cc-4c8e-453c-a04c-19972742b5ca-kube-api-access-2br5g\") pod \"244b08cc-4c8e-453c-a04c-19972742b5ca\" (UID: 
\"244b08cc-4c8e-453c-a04c-19972742b5ca\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.304590 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-swift-storage-0\") pod \"244b08cc-4c8e-453c-a04c-19972742b5ca\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.304677 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-horizon-secret-key\") pod \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.304731 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-logs\") pod \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.304776 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-config\") pod \"244b08cc-4c8e-453c-a04c-19972742b5ca\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.304805 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-sb\") pod \"244b08cc-4c8e-453c-a04c-19972742b5ca\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.304861 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-scripts\") pod \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.304931 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzljb\" (UniqueName: \"kubernetes.io/projected/f0db6312-44e4-4765-aabf-1d8620893756-kube-api-access-rzljb\") pod \"f0db6312-44e4-4765-aabf-1d8620893756\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.304951 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6sks\" (UniqueName: \"kubernetes.io/projected/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-kube-api-access-g6sks\") pod \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.304974 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-nb\") pod \"244b08cc-4c8e-453c-a04c-19972742b5ca\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.305045 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-combined-ca-bundle\") pod \"f0db6312-44e4-4765-aabf-1d8620893756\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.306196 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-config\") pod \"f0db6312-44e4-4765-aabf-1d8620893756\" (UID: \"f0db6312-44e4-4765-aabf-1d8620893756\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 
14:23:46.306271 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-config-data\") pod \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\" (UID: \"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.306294 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-svc\") pod \"244b08cc-4c8e-453c-a04c-19972742b5ca\" (UID: \"244b08cc-4c8e-453c-a04c-19972742b5ca\") " Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.307879 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-scripts" (OuterVolumeSpecName: "scripts") pod "711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9" (UID: "711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.314195 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-config-data" (OuterVolumeSpecName: "config-data") pod "711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9" (UID: "711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.314341 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244b08cc-4c8e-453c-a04c-19972742b5ca-kube-api-access-2br5g" (OuterVolumeSpecName: "kube-api-access-2br5g") pod "244b08cc-4c8e-453c-a04c-19972742b5ca" (UID: "244b08cc-4c8e-453c-a04c-19972742b5ca"). InnerVolumeSpecName "kube-api-access-2br5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.315506 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-logs" (OuterVolumeSpecName: "logs") pod "711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9" (UID: "711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.315703 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.315793 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2br5g\" (UniqueName: \"kubernetes.io/projected/244b08cc-4c8e-453c-a04c-19972742b5ca-kube-api-access-2br5g\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.315942 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-logs\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.315957 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.324071 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0db6312-44e4-4765-aabf-1d8620893756-kube-api-access-rzljb" (OuterVolumeSpecName: "kube-api-access-rzljb") pod "f0db6312-44e4-4765-aabf-1d8620893756" (UID: "f0db6312-44e4-4765-aabf-1d8620893756"). InnerVolumeSpecName "kube-api-access-rzljb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.330675 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-kube-api-access-g6sks" (OuterVolumeSpecName: "kube-api-access-g6sks") pod "711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9" (UID: "711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9"). InnerVolumeSpecName "kube-api-access-g6sks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.340597 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9" (UID: "711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.372019 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0db6312-44e4-4765-aabf-1d8620893756" (UID: "f0db6312-44e4-4765-aabf-1d8620893756"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.374175 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-config" (OuterVolumeSpecName: "config") pod "f0db6312-44e4-4765-aabf-1d8620893756" (UID: "f0db6312-44e4-4765-aabf-1d8620893756"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.414647 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "244b08cc-4c8e-453c-a04c-19972742b5ca" (UID: "244b08cc-4c8e-453c-a04c-19972742b5ca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.417617 4911 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.417654 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzljb\" (UniqueName: \"kubernetes.io/projected/f0db6312-44e4-4765-aabf-1d8620893756-kube-api-access-rzljb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.417666 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6sks\" (UniqueName: \"kubernetes.io/projected/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9-kube-api-access-g6sks\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.417676 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.417687 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.417697 4911 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/f0db6312-44e4-4765-aabf-1d8620893756-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.420632 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "244b08cc-4c8e-453c-a04c-19972742b5ca" (UID: "244b08cc-4c8e-453c-a04c-19972742b5ca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.421548 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "244b08cc-4c8e-453c-a04c-19972742b5ca" (UID: "244b08cc-4c8e-453c-a04c-19972742b5ca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.446481 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-config" (OuterVolumeSpecName: "config") pod "244b08cc-4c8e-453c-a04c-19972742b5ca" (UID: "244b08cc-4c8e-453c-a04c-19972742b5ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.448604 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "244b08cc-4c8e-453c-a04c-19972742b5ca" (UID: "244b08cc-4c8e-453c-a04c-19972742b5ca"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.519874 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.519904 4911 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.519915 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.519923 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244b08cc-4c8e-453c-a04c-19972742b5ca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.739990 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b4dd68964-gfvp8"] Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.829263 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54d884b5d4-lsz26"] Mar 10 14:23:46 crc kubenswrapper[4911]: W0310 14:23:46.830119 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6be9e57d_52b9_4de2_9201_1b85feda712c.slice/crio-d662a668c37583430349490a6bcf95d132e4d5f26ce9db391a59ee106235cce7 WatchSource:0}: Error finding container d662a668c37583430349490a6bcf95d132e4d5f26ce9db391a59ee106235cce7: Status 404 returned error can't find the container with id d662a668c37583430349490a6bcf95d132e4d5f26ce9db391a59ee106235cce7 Mar 10 14:23:46 crc 
kubenswrapper[4911]: I0310 14:23:46.917174 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9w5xj"] Mar 10 14:23:46 crc kubenswrapper[4911]: I0310 14:23:46.935656 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:23:46 crc kubenswrapper[4911]: W0310 14:23:46.959899 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f554253_d7fc_4ca5_8f4b_fafd051fbee4.slice/crio-fed1386d7badb8d274a2aab0ca439a2c0922ac8aee32db185ddca43b2e7a42cd WatchSource:0}: Error finding container fed1386d7badb8d274a2aab0ca439a2c0922ac8aee32db185ddca43b2e7a42cd: Status 404 returned error can't find the container with id fed1386d7badb8d274a2aab0ca439a2c0922ac8aee32db185ddca43b2e7a42cd Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.015576 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wk4q4" event={"ID":"f0db6312-44e4-4765-aabf-1d8620893756","Type":"ContainerDied","Data":"6676f76e0f2c49fe4bd4639bf012bbaa6ff65f2cf0481da730a548210f508dff"} Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.015633 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6676f76e0f2c49fe4bd4639bf012bbaa6ff65f2cf0481da730a548210f508dff" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.015706 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wk4q4" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.018036 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.024750 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" event={"ID":"244b08cc-4c8e-453c-a04c-19972742b5ca","Type":"ContainerDied","Data":"b9c34c15df9f6b64bdab10edddc533c76e7bc03dc5725ab96f7f5ac248705368"} Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.024864 4911 scope.go:117] "RemoveContainer" containerID="f7582e25553c29861572627ca6db36fb8b1f3f33de8075082558cdb6b996be71" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.025064 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kbjd8" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.029532 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd5cf7c7f-sk89q" event={"ID":"711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9","Type":"ContainerDied","Data":"b49ea2bd7f15edb14828deec5e303dfd3cc02df14ef43a938c1189e874a0decd"} Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.029559 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bd5cf7c7f-sk89q" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.036354 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b58550bb-f4f6-40ef-9424-7bfd05d57c9d","Type":"ContainerStarted","Data":"22aa495f77854b28791c6198eef5f790b40e5b72fbf7ac5106488db6c05cab8e"} Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.038965 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54cf4db59f-7jqnh" event={"ID":"5046ce07-0ef5-4efc-aaf4-0867d7acea09","Type":"ContainerStarted","Data":"9830e5c56d818f9394253b818790bc839d6a9e709579ea217ea4f8467a1ed1a3"} Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.038999 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54cf4db59f-7jqnh" event={"ID":"5046ce07-0ef5-4efc-aaf4-0867d7acea09","Type":"ContainerStarted","Data":"a869305265bf235adabf35ccd01a2083f5054d9da239357e1f4078c1119f14e2"} Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.039144 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54cf4db59f-7jqnh" podUID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" containerName="horizon-log" containerID="cri-o://a869305265bf235adabf35ccd01a2083f5054d9da239357e1f4078c1119f14e2" gracePeriod=30 Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.039431 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54cf4db59f-7jqnh" podUID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" containerName="horizon" containerID="cri-o://9830e5c56d818f9394253b818790bc839d6a9e709579ea217ea4f8467a1ed1a3" gracePeriod=30 Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.044211 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d884b5d4-lsz26" 
event={"ID":"6be9e57d-52b9-4de2-9201-1b85feda712c","Type":"ContainerStarted","Data":"d662a668c37583430349490a6bcf95d132e4d5f26ce9db391a59ee106235cce7"} Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.045527 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"add8035b-c6a4-47d1-aa42-ed381ba87b11","Type":"ContainerStarted","Data":"6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615"} Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.048240 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9w5xj" event={"ID":"6f554253-d7fc-4ca5-8f4b-fafd051fbee4","Type":"ContainerStarted","Data":"fed1386d7badb8d274a2aab0ca439a2c0922ac8aee32db185ddca43b2e7a42cd"} Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.052306 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b4dd68964-gfvp8" event={"ID":"a546f2b5-3536-4608-b1f4-0127ebd52bfa","Type":"ContainerStarted","Data":"33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792"} Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.052377 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b4dd68964-gfvp8" event={"ID":"a546f2b5-3536-4608-b1f4-0127ebd52bfa","Type":"ContainerStarted","Data":"b5b48fac898470bc81f43e5d81e9ece261227a7f678074e6d5b7d82735e90ea8"} Mar 10 14:23:47 crc kubenswrapper[4911]: E0310 14:23:47.054233 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-85swm" podUID="67b878c7-d1cf-4656-8762-7be57cf1491a" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.066854 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54cf4db59f-7jqnh" podStartSLOduration=2.557808189 
podStartE2EDuration="28.066827404s" podCreationTimestamp="2026-03-10 14:23:19 +0000 UTC" firstStartedPulling="2026-03-10 14:23:20.600278916 +0000 UTC m=+1305.163798833" lastFinishedPulling="2026-03-10 14:23:46.109297971 +0000 UTC m=+1330.672818048" observedRunningTime="2026-03-10 14:23:47.062587143 +0000 UTC m=+1331.626107060" watchObservedRunningTime="2026-03-10 14:23:47.066827404 +0000 UTC m=+1331.630347321" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.109069 4911 scope.go:117] "RemoveContainer" containerID="bfd1dce16419efeb13bbeaf36197ce64a637857c033483352cb3414a48fe8782" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.135069 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kbjd8"] Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.181893 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kbjd8"] Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.199356 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bd5cf7c7f-sk89q"] Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.209538 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bd5cf7c7f-sk89q"] Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.455677 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-xgm4j"] Mar 10 14:23:47 crc kubenswrapper[4911]: E0310 14:23:47.462259 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0db6312-44e4-4765-aabf-1d8620893756" containerName="neutron-db-sync" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.462300 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0db6312-44e4-4765-aabf-1d8620893756" containerName="neutron-db-sync" Mar 10 14:23:47 crc kubenswrapper[4911]: E0310 14:23:47.462356 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerName="init" Mar 10 
14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.462363 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerName="init" Mar 10 14:23:47 crc kubenswrapper[4911]: E0310 14:23:47.462382 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerName="dnsmasq-dns" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.462388 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerName="dnsmasq-dns" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.462742 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="244b08cc-4c8e-453c-a04c-19972742b5ca" containerName="dnsmasq-dns" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.462773 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0db6312-44e4-4765-aabf-1d8620893756" containerName="neutron-db-sync" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.463872 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.492784 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-xgm4j"] Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.587582 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.587633 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.587655 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.587685 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-config\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.587715 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-svc\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.587797 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz57k\" (UniqueName: \"kubernetes.io/projected/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-kube-api-access-bz57k\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.615876 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cf9dfcf44-sls8z"] Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.617810 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.621597 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s462l" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.621793 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.622012 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.622247 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.634861 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cf9dfcf44-sls8z"] Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690176 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bz57k\" (UniqueName: \"kubernetes.io/projected/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-kube-api-access-bz57k\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690279 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-ovndb-tls-certs\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690335 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-combined-ca-bundle\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690385 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690421 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690446 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690484 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-config\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690515 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-httpd-config\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690548 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-svc\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690661 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-config\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.690704 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h85t2\" (UniqueName: \"kubernetes.io/projected/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-kube-api-access-h85t2\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.692140 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.693048 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.693685 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.694235 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-config\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.695590 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-svc\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.715980 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz57k\" (UniqueName: \"kubernetes.io/projected/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-kube-api-access-bz57k\") pod \"dnsmasq-dns-55f844cf75-xgm4j\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") " pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.795136 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-ovndb-tls-certs\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.796138 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-combined-ca-bundle\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.796285 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-httpd-config\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.796317 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-config\") pod 
\"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.796366 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h85t2\" (UniqueName: \"kubernetes.io/projected/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-kube-api-access-h85t2\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.800030 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-ovndb-tls-certs\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.805173 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-combined-ca-bundle\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.814599 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-httpd-config\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.815659 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-config\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc 
kubenswrapper[4911]: I0310 14:23:47.826291 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h85t2\" (UniqueName: \"kubernetes.io/projected/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-kube-api-access-h85t2\") pod \"neutron-cf9dfcf44-sls8z\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.864781 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:47 crc kubenswrapper[4911]: I0310 14:23:47.980849 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.107660 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b58550bb-f4f6-40ef-9424-7bfd05d57c9d","Type":"ContainerStarted","Data":"09a287c0af9ac56378919e06cbb2868faa3a49db885c95daf22c889f9f356e32"} Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.113381 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d884b5d4-lsz26" event={"ID":"6be9e57d-52b9-4de2-9201-1b85feda712c","Type":"ContainerStarted","Data":"8750bc9c78bce59fe9763c4ae792046774ec6179ff8f373adc35a4807ffd929a"} Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.113428 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d884b5d4-lsz26" event={"ID":"6be9e57d-52b9-4de2-9201-1b85feda712c","Type":"ContainerStarted","Data":"3254e1ee9079ab7710a478489f03fc1e6df691391edc0297d7baeee2052abb9c"} Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.133820 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9w5xj" event={"ID":"6f554253-d7fc-4ca5-8f4b-fafd051fbee4","Type":"ContainerStarted","Data":"a7685173559e3dc24a03bb6e10a53a23002da938421bb4502acd6723ddc4e6d2"} Mar 10 14:23:48 crc 
kubenswrapper[4911]: I0310 14:23:48.150838 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54d884b5d4-lsz26" podStartSLOduration=23.150810851 podStartE2EDuration="23.150810851s" podCreationTimestamp="2026-03-10 14:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:48.139239529 +0000 UTC m=+1332.702759556" watchObservedRunningTime="2026-03-10 14:23:48.150810851 +0000 UTC m=+1332.714330768" Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.175430 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4428f0a9-5fdc-4bdb-931e-da67fb9498c9","Type":"ContainerStarted","Data":"9e52857bca752217a2557471d292bc9df5ae5ebd923dc0237feb2ab9bdfbd7f8"} Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.178943 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b4dd68964-gfvp8" event={"ID":"a546f2b5-3536-4608-b1f4-0127ebd52bfa","Type":"ContainerStarted","Data":"dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a"} Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.181240 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9w5xj" podStartSLOduration=8.181224066 podStartE2EDuration="8.181224066s" podCreationTimestamp="2026-03-10 14:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:48.173281018 +0000 UTC m=+1332.736800935" watchObservedRunningTime="2026-03-10 14:23:48.181224066 +0000 UTC m=+1332.744743983" Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.210758 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244b08cc-4c8e-453c-a04c-19972742b5ca" path="/var/lib/kubelet/pods/244b08cc-4c8e-453c-a04c-19972742b5ca/volumes" Mar 10 
14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.211671 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9" path="/var/lib/kubelet/pods/711535ec-b6c0-40b0-8aa0-8e7d5d50d3b9/volumes" Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.212254 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b4dd68964-gfvp8" podStartSLOduration=23.212233146 podStartE2EDuration="23.212233146s" podCreationTimestamp="2026-03-10 14:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:48.203367584 +0000 UTC m=+1332.766887511" watchObservedRunningTime="2026-03-10 14:23:48.212233146 +0000 UTC m=+1332.775753063" Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.509122 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-xgm4j"] Mar 10 14:23:48 crc kubenswrapper[4911]: I0310 14:23:48.944130 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cf9dfcf44-sls8z"] Mar 10 14:23:49 crc kubenswrapper[4911]: I0310 14:23:49.200258 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" event={"ID":"7a0a0bdc-e60b-4a14-b244-49619bde6bd6","Type":"ContainerStarted","Data":"1ea24f2c2e21cead37b8d9fb0a88dac205d7adcc0522147994e2d4d97fd46278"} Mar 10 14:23:49 crc kubenswrapper[4911]: I0310 14:23:49.202689 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4428f0a9-5fdc-4bdb-931e-da67fb9498c9","Type":"ContainerStarted","Data":"5a23957fa981e06933df72f1bb48cabd3f01029020258904eed5bd94e0c03126"} Mar 10 14:23:49 crc kubenswrapper[4911]: I0310 14:23:49.739961 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:23:49 crc kubenswrapper[4911]: I0310 
14:23:49.976027 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78dcb5b94f-bjgh8"] Mar 10 14:23:49 crc kubenswrapper[4911]: I0310 14:23:49.978338 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:49 crc kubenswrapper[4911]: I0310 14:23:49.984690 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 10 14:23:49 crc kubenswrapper[4911]: I0310 14:23:49.984979 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.003021 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78dcb5b94f-bjgh8"] Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.079597 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9rpg\" (UniqueName: \"kubernetes.io/projected/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-kube-api-access-v9rpg\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.079669 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-internal-tls-certs\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.079693 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-ovndb-tls-certs\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " 
pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.079760 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-config\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.079819 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-httpd-config\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.079855 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-combined-ca-bundle\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.079902 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-public-tls-certs\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.181920 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9rpg\" (UniqueName: \"kubernetes.io/projected/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-kube-api-access-v9rpg\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " 
pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.182004 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-internal-tls-certs\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.182047 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-ovndb-tls-certs\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.182099 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-config\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.182177 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-httpd-config\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.182219 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-combined-ca-bundle\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.182277 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-public-tls-certs\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.189999 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-public-tls-certs\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.190611 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-config\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.191289 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-combined-ca-bundle\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.192524 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-ovndb-tls-certs\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.194280 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-internal-tls-certs\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.196433 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-httpd-config\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.204586 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9rpg\" (UniqueName: \"kubernetes.io/projected/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-kube-api-access-v9rpg\") pod \"neutron-78dcb5b94f-bjgh8\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.229063 4911 generic.go:334] "Generic (PLEG): container finished" podID="7a0a0bdc-e60b-4a14-b244-49619bde6bd6" containerID="d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1" exitCode=0 Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.229143 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" event={"ID":"7a0a0bdc-e60b-4a14-b244-49619bde6bd6","Type":"ContainerDied","Data":"d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1"} Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.238272 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"add8035b-c6a4-47d1-aa42-ed381ba87b11","Type":"ContainerStarted","Data":"5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46"} Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.245985 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf9dfcf44-sls8z" 
event={"ID":"4b96f0e8-39e3-4018-bb9e-75b83d333bd4","Type":"ContainerStarted","Data":"401715d98a839e18293dc3f8c3e947d7a73c27c3071b88d923203d15ef2cca0b"} Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.246045 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf9dfcf44-sls8z" event={"ID":"4b96f0e8-39e3-4018-bb9e-75b83d333bd4","Type":"ContainerStarted","Data":"95be14fb3901116d3d2f49bbf6cd4ceb984aa453ad533d64f331c09118428d98"} Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.246056 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf9dfcf44-sls8z" event={"ID":"4b96f0e8-39e3-4018-bb9e-75b83d333bd4","Type":"ContainerStarted","Data":"1b348239954ffbb80979fd166d9a271796e64ea393716e0634cf49c0a534e6e8"} Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.246977 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.271699 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4428f0a9-5fdc-4bdb-931e-da67fb9498c9","Type":"ContainerStarted","Data":"dd9670e57c5cef170cc3de9608ba90e388a6380241c3fe1a1378fdc2169ceb0b"} Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.289333 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b58550bb-f4f6-40ef-9424-7bfd05d57c9d","Type":"ContainerStarted","Data":"d0a2307df180200bd46740c366eaa709060005f76fa90802cb4fb9cd917ba67b"} Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.303533 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cf9dfcf44-sls8z" podStartSLOduration=3.3035110469999998 podStartE2EDuration="3.303511047s" podCreationTimestamp="2026-03-10 14:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 14:23:50.290393144 +0000 UTC m=+1334.853913061" watchObservedRunningTime="2026-03-10 14:23:50.303511047 +0000 UTC m=+1334.867030964" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.304302 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.353814 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.353710248 podStartE2EDuration="5.353710248s" podCreationTimestamp="2026-03-10 14:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:50.352597619 +0000 UTC m=+1334.916117536" watchObservedRunningTime="2026-03-10 14:23:50.353710248 +0000 UTC m=+1334.917230165" Mar 10 14:23:50 crc kubenswrapper[4911]: I0310 14:23:50.367115 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=18.367085698 podStartE2EDuration="18.367085698s" podCreationTimestamp="2026-03-10 14:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:50.32429893 +0000 UTC m=+1334.887818847" watchObservedRunningTime="2026-03-10 14:23:50.367085698 +0000 UTC m=+1334.930605615" Mar 10 14:23:51 crc kubenswrapper[4911]: I0310 14:23:51.001952 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78dcb5b94f-bjgh8"] Mar 10 14:23:51 crc kubenswrapper[4911]: W0310 14:23:51.009541 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6add0c8c_ea9f_4fc7_87ad_7c58e8b4a303.slice/crio-b91eb1af87fc3b56e6bcef353c9517fb08927f743e4a72a9b79e8eb6f672f0b0 WatchSource:0}: Error finding container 
b91eb1af87fc3b56e6bcef353c9517fb08927f743e4a72a9b79e8eb6f672f0b0: Status 404 returned error can't find the container with id b91eb1af87fc3b56e6bcef353c9517fb08927f743e4a72a9b79e8eb6f672f0b0 Mar 10 14:23:51 crc kubenswrapper[4911]: I0310 14:23:51.305549 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78dcb5b94f-bjgh8" event={"ID":"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303","Type":"ContainerStarted","Data":"b91eb1af87fc3b56e6bcef353c9517fb08927f743e4a72a9b79e8eb6f672f0b0"} Mar 10 14:23:51 crc kubenswrapper[4911]: I0310 14:23:51.310310 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" event={"ID":"7a0a0bdc-e60b-4a14-b244-49619bde6bd6","Type":"ContainerStarted","Data":"bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901"} Mar 10 14:23:51 crc kubenswrapper[4911]: I0310 14:23:51.333452 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" podStartSLOduration=4.333429171 podStartE2EDuration="4.333429171s" podCreationTimestamp="2026-03-10 14:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:51.327131816 +0000 UTC m=+1335.890651733" watchObservedRunningTime="2026-03-10 14:23:51.333429171 +0000 UTC m=+1335.896949088" Mar 10 14:23:52 crc kubenswrapper[4911]: I0310 14:23:52.357512 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78dcb5b94f-bjgh8" event={"ID":"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303","Type":"ContainerStarted","Data":"94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea"} Mar 10 14:23:52 crc kubenswrapper[4911]: I0310 14:23:52.357927 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:52 crc kubenswrapper[4911]: I0310 14:23:52.357971 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:23:52 crc kubenswrapper[4911]: I0310 14:23:52.357981 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78dcb5b94f-bjgh8" event={"ID":"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303","Type":"ContainerStarted","Data":"4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122"} Mar 10 14:23:52 crc kubenswrapper[4911]: I0310 14:23:52.389556 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78dcb5b94f-bjgh8" podStartSLOduration=3.389462378 podStartE2EDuration="3.389462378s" podCreationTimestamp="2026-03-10 14:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:23:52.379364254 +0000 UTC m=+1336.942884171" watchObservedRunningTime="2026-03-10 14:23:52.389462378 +0000 UTC m=+1336.952982295" Mar 10 14:23:53 crc kubenswrapper[4911]: I0310 14:23:53.369093 4911 generic.go:334] "Generic (PLEG): container finished" podID="6f554253-d7fc-4ca5-8f4b-fafd051fbee4" containerID="a7685173559e3dc24a03bb6e10a53a23002da938421bb4502acd6723ddc4e6d2" exitCode=0 Mar 10 14:23:53 crc kubenswrapper[4911]: I0310 14:23:53.369177 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9w5xj" event={"ID":"6f554253-d7fc-4ca5-8f4b-fafd051fbee4","Type":"ContainerDied","Data":"a7685173559e3dc24a03bb6e10a53a23002da938421bb4502acd6723ddc4e6d2"} Mar 10 14:23:53 crc kubenswrapper[4911]: I0310 14:23:53.508777 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 14:23:53 crc kubenswrapper[4911]: I0310 14:23:53.509220 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 14:23:53 crc kubenswrapper[4911]: I0310 14:23:53.565586 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Mar 10 14:23:53 crc kubenswrapper[4911]: I0310 14:23:53.599071 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 14:23:54 crc kubenswrapper[4911]: I0310 14:23:54.379819 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 14:23:54 crc kubenswrapper[4911]: I0310 14:23:54.380118 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 14:23:55 crc kubenswrapper[4911]: I0310 14:23:55.702983 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:55 crc kubenswrapper[4911]: I0310 14:23:55.703664 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:55 crc kubenswrapper[4911]: I0310 14:23:55.751425 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:55 crc kubenswrapper[4911]: I0310 14:23:55.757258 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:55 crc kubenswrapper[4911]: I0310 14:23:55.853299 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:55 crc kubenswrapper[4911]: I0310 14:23:55.853363 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:23:56 crc kubenswrapper[4911]: I0310 14:23:56.166593 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:56 crc kubenswrapper[4911]: I0310 14:23:56.167046 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:23:56 crc kubenswrapper[4911]: I0310 14:23:56.408655 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 14:23:56 crc kubenswrapper[4911]: I0310 14:23:56.408682 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 14:23:56 crc kubenswrapper[4911]: I0310 14:23:56.409537 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:56 crc kubenswrapper[4911]: I0310 14:23:56.409597 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:56 crc kubenswrapper[4911]: I0310 14:23:56.652360 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 14:23:56 crc kubenswrapper[4911]: I0310 14:23:56.654641 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 14:23:57 crc kubenswrapper[4911]: I0310 14:23:57.865994 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" Mar 10 14:23:57 crc kubenswrapper[4911]: I0310 14:23:57.942105 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-8jdnn"] Mar 10 14:23:57 crc kubenswrapper[4911]: I0310 14:23:57.942779 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" podUID="44517f61-1ece-44c0-8831-4a3e5d188c0f" containerName="dnsmasq-dns" containerID="cri-o://5075f9aa56253f4c1187726c7b20ff42243972d7195226fb17d3d88487655ace" gracePeriod=10 Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.305373 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.377448 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-scripts\") pod \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.377777 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-combined-ca-bundle\") pod \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.377824 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-config-data\") pod \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.377853 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p82s\" (UniqueName: \"kubernetes.io/projected/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-kube-api-access-5p82s\") pod \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.377877 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-fernet-keys\") pod \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.377913 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-credential-keys\") pod \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\" (UID: \"6f554253-d7fc-4ca5-8f4b-fafd051fbee4\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.389158 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6f554253-d7fc-4ca5-8f4b-fafd051fbee4" (UID: "6f554253-d7fc-4ca5-8f4b-fafd051fbee4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.431956 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6f554253-d7fc-4ca5-8f4b-fafd051fbee4" (UID: "6f554253-d7fc-4ca5-8f4b-fafd051fbee4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.432084 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-scripts" (OuterVolumeSpecName: "scripts") pod "6f554253-d7fc-4ca5-8f4b-fafd051fbee4" (UID: "6f554253-d7fc-4ca5-8f4b-fafd051fbee4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.451097 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-kube-api-access-5p82s" (OuterVolumeSpecName: "kube-api-access-5p82s") pod "6f554253-d7fc-4ca5-8f4b-fafd051fbee4" (UID: "6f554253-d7fc-4ca5-8f4b-fafd051fbee4"). InnerVolumeSpecName "kube-api-access-5p82s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.459994 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9w5xj" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.460235 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9w5xj" event={"ID":"6f554253-d7fc-4ca5-8f4b-fafd051fbee4","Type":"ContainerDied","Data":"fed1386d7badb8d274a2aab0ca439a2c0922ac8aee32db185ddca43b2e7a42cd"} Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.460279 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fed1386d7badb8d274a2aab0ca439a2c0922ac8aee32db185ddca43b2e7a42cd" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.476425 4911 generic.go:334] "Generic (PLEG): container finished" podID="44517f61-1ece-44c0-8831-4a3e5d188c0f" containerID="5075f9aa56253f4c1187726c7b20ff42243972d7195226fb17d3d88487655ace" exitCode=0 Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.477125 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" event={"ID":"44517f61-1ece-44c0-8831-4a3e5d188c0f","Type":"ContainerDied","Data":"5075f9aa56253f4c1187726c7b20ff42243972d7195226fb17d3d88487655ace"} Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.481290 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p82s\" (UniqueName: \"kubernetes.io/projected/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-kube-api-access-5p82s\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.481326 4911 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.481339 4911 reconciler_common.go:293] "Volume detached for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.481347 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.483118 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f554253-d7fc-4ca5-8f4b-fafd051fbee4" (UID: "6f554253-d7fc-4ca5-8f4b-fafd051fbee4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.492738 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-config-data" (OuterVolumeSpecName: "config-data") pod "6f554253-d7fc-4ca5-8f4b-fafd051fbee4" (UID: "6f554253-d7fc-4ca5-8f4b-fafd051fbee4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.523063 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.583020 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-sb\") pod \"44517f61-1ece-44c0-8831-4a3e5d188c0f\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.583152 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-nb\") pod \"44517f61-1ece-44c0-8831-4a3e5d188c0f\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.583257 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzbxd\" (UniqueName: \"kubernetes.io/projected/44517f61-1ece-44c0-8831-4a3e5d188c0f-kube-api-access-wzbxd\") pod \"44517f61-1ece-44c0-8831-4a3e5d188c0f\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.583345 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-svc\") pod \"44517f61-1ece-44c0-8831-4a3e5d188c0f\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.583437 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-swift-storage-0\") pod \"44517f61-1ece-44c0-8831-4a3e5d188c0f\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.583680 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-config\") pod \"44517f61-1ece-44c0-8831-4a3e5d188c0f\" (UID: \"44517f61-1ece-44c0-8831-4a3e5d188c0f\") " Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.584588 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.584606 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f554253-d7fc-4ca5-8f4b-fafd051fbee4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.608289 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44517f61-1ece-44c0-8831-4a3e5d188c0f-kube-api-access-wzbxd" (OuterVolumeSpecName: "kube-api-access-wzbxd") pod "44517f61-1ece-44c0-8831-4a3e5d188c0f" (UID: "44517f61-1ece-44c0-8831-4a3e5d188c0f"). InnerVolumeSpecName "kube-api-access-wzbxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.689771 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzbxd\" (UniqueName: \"kubernetes.io/projected/44517f61-1ece-44c0-8831-4a3e5d188c0f-kube-api-access-wzbxd\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.702773 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "44517f61-1ece-44c0-8831-4a3e5d188c0f" (UID: "44517f61-1ece-44c0-8831-4a3e5d188c0f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.709536 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "44517f61-1ece-44c0-8831-4a3e5d188c0f" (UID: "44517f61-1ece-44c0-8831-4a3e5d188c0f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.714936 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44517f61-1ece-44c0-8831-4a3e5d188c0f" (UID: "44517f61-1ece-44c0-8831-4a3e5d188c0f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.719111 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-config" (OuterVolumeSpecName: "config") pod "44517f61-1ece-44c0-8831-4a3e5d188c0f" (UID: "44517f61-1ece-44c0-8831-4a3e5d188c0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.744299 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "44517f61-1ece-44c0-8831-4a3e5d188c0f" (UID: "44517f61-1ece-44c0-8831-4a3e5d188c0f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.794054 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.794090 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.794105 4911 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.794119 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:58 crc kubenswrapper[4911]: I0310 14:23:58.794128 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44517f61-1ece-44c0-8831-4a3e5d188c0f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.452390 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-76d846bbc6-4wr5p"] Mar 10 14:23:59 crc kubenswrapper[4911]: E0310 14:23:59.452824 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44517f61-1ece-44c0-8831-4a3e5d188c0f" containerName="dnsmasq-dns" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.452843 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="44517f61-1ece-44c0-8831-4a3e5d188c0f" containerName="dnsmasq-dns" Mar 10 14:23:59 crc kubenswrapper[4911]: E0310 14:23:59.452865 4911 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44517f61-1ece-44c0-8831-4a3e5d188c0f" containerName="init" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.452872 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="44517f61-1ece-44c0-8831-4a3e5d188c0f" containerName="init" Mar 10 14:23:59 crc kubenswrapper[4911]: E0310 14:23:59.452899 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f554253-d7fc-4ca5-8f4b-fafd051fbee4" containerName="keystone-bootstrap" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.452906 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f554253-d7fc-4ca5-8f4b-fafd051fbee4" containerName="keystone-bootstrap" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.453109 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="44517f61-1ece-44c0-8831-4a3e5d188c0f" containerName="dnsmasq-dns" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.453144 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f554253-d7fc-4ca5-8f4b-fafd051fbee4" containerName="keystone-bootstrap" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.453822 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.461497 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.473144 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.473401 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.473600 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.473781 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.473941 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4vwm6" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.502445 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76d846bbc6-4wr5p"] Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.508007 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zzrw\" (UniqueName: \"kubernetes.io/projected/3370ba4c-d284-4d51-8b2d-d1da50950def-kube-api-access-8zzrw\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.508095 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-public-tls-certs\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") 
" pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.508120 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-scripts\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.508159 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-combined-ca-bundle\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.508176 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-config-data\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.508195 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-fernet-keys\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.508292 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-credential-keys\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " 
pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.508567 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-internal-tls-certs\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.508772 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"add8035b-c6a4-47d1-aa42-ed381ba87b11","Type":"ContainerStarted","Data":"1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176"} Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.522918 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" event={"ID":"44517f61-1ece-44c0-8831-4a3e5d188c0f","Type":"ContainerDied","Data":"f586471956fb5344c198b8def8f98a837ce06ecb6d9637902609d480fec1343e"} Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.522991 4911 scope.go:117] "RemoveContainer" containerID="5075f9aa56253f4c1187726c7b20ff42243972d7195226fb17d3d88487655ace" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.523184 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-8jdnn" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.533335 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2s85z" event={"ID":"86c9857f-704f-48ce-b90b-9275e9eba41a","Type":"ContainerStarted","Data":"b565c79511677b5104e61adf04d1b88b0ff1a7e493a7f6e74fe5f320924efb13"} Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.573073 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2s85z" podStartSLOduration=3.408524156 podStartE2EDuration="42.573044935s" podCreationTimestamp="2026-03-10 14:23:17 +0000 UTC" firstStartedPulling="2026-03-10 14:23:18.843317348 +0000 UTC m=+1303.406837265" lastFinishedPulling="2026-03-10 14:23:58.007838127 +0000 UTC m=+1342.571358044" observedRunningTime="2026-03-10 14:23:59.568704562 +0000 UTC m=+1344.132224479" watchObservedRunningTime="2026-03-10 14:23:59.573044935 +0000 UTC m=+1344.136564852" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.584144 4911 scope.go:117] "RemoveContainer" containerID="525e76ed2e8269372d610c9c40a8d29d78fbf71de09527d59813ce36f53b7785" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.604388 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-8jdnn"] Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.612074 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-public-tls-certs\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.612113 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-scripts\") pod 
\"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.612168 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-combined-ca-bundle\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.612193 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-config-data\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.612218 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-fernet-keys\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.612468 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-credential-keys\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.612521 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-internal-tls-certs\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " 
pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.612577 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zzrw\" (UniqueName: \"kubernetes.io/projected/3370ba4c-d284-4d51-8b2d-d1da50950def-kube-api-access-8zzrw\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.617865 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-8jdnn"] Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.623772 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-fernet-keys\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.628228 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-scripts\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.633303 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-public-tls-certs\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.647832 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-internal-tls-certs\") pod \"keystone-76d846bbc6-4wr5p\" (UID: 
\"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.648452 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-credential-keys\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.652588 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-combined-ca-bundle\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.654499 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3370ba4c-d284-4d51-8b2d-d1da50950def-config-data\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.666539 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zzrw\" (UniqueName: \"kubernetes.io/projected/3370ba4c-d284-4d51-8b2d-d1da50950def-kube-api-access-8zzrw\") pod \"keystone-76d846bbc6-4wr5p\" (UID: \"3370ba4c-d284-4d51-8b2d-d1da50950def\") " pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.784495 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.860936 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.861087 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 14:23:59 crc kubenswrapper[4911]: I0310 14:23:59.863368 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.148772 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552544-2kc8q"] Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.153640 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552544-2kc8q"] Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.153773 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552544-2kc8q" Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.163467 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.163796 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.164343 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.220632 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44517f61-1ece-44c0-8831-4a3e5d188c0f" path="/var/lib/kubelet/pods/44517f61-1ece-44c0-8831-4a3e5d188c0f/volumes" Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.343562 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb6ft\" (UniqueName: \"kubernetes.io/projected/b31175aa-9ebc-4576-a218-b9d926c1c559-kube-api-access-cb6ft\") pod \"auto-csr-approver-29552544-2kc8q\" (UID: \"b31175aa-9ebc-4576-a218-b9d926c1c559\") " pod="openshift-infra/auto-csr-approver-29552544-2kc8q" Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.455327 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb6ft\" (UniqueName: \"kubernetes.io/projected/b31175aa-9ebc-4576-a218-b9d926c1c559-kube-api-access-cb6ft\") pod \"auto-csr-approver-29552544-2kc8q\" (UID: \"b31175aa-9ebc-4576-a218-b9d926c1c559\") " pod="openshift-infra/auto-csr-approver-29552544-2kc8q" Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.501023 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb6ft\" (UniqueName: \"kubernetes.io/projected/b31175aa-9ebc-4576-a218-b9d926c1c559-kube-api-access-cb6ft\") pod 
\"auto-csr-approver-29552544-2kc8q\" (UID: \"b31175aa-9ebc-4576-a218-b9d926c1c559\") " pod="openshift-infra/auto-csr-approver-29552544-2kc8q" Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.523536 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76d846bbc6-4wr5p"] Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.566205 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76d846bbc6-4wr5p" event={"ID":"3370ba4c-d284-4d51-8b2d-d1da50950def","Type":"ContainerStarted","Data":"41cdbd64ffe7c11e872fc5e7afeb9fe404ddf23327cd882ba06ef392e91f4c15"} Mar 10 14:24:00 crc kubenswrapper[4911]: I0310 14:24:00.790037 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552544-2kc8q" Mar 10 14:24:01 crc kubenswrapper[4911]: I0310 14:24:01.332684 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552544-2kc8q"] Mar 10 14:24:01 crc kubenswrapper[4911]: I0310 14:24:01.587104 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76d846bbc6-4wr5p" event={"ID":"3370ba4c-d284-4d51-8b2d-d1da50950def","Type":"ContainerStarted","Data":"793e179dc0cd204b6ae09f3591d85301a69624fc8ee3e2a474ff8f39c7e7045c"} Mar 10 14:24:01 crc kubenswrapper[4911]: I0310 14:24:01.587305 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-76d846bbc6-4wr5p" Mar 10 14:24:01 crc kubenswrapper[4911]: I0310 14:24:01.605200 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552544-2kc8q" event={"ID":"b31175aa-9ebc-4576-a218-b9d926c1c559","Type":"ContainerStarted","Data":"08485fc4781600632058de451bafb2192306aed96e45e8858512eef094f08e85"} Mar 10 14:24:01 crc kubenswrapper[4911]: I0310 14:24:01.609346 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-85swm" 
event={"ID":"67b878c7-d1cf-4656-8762-7be57cf1491a","Type":"ContainerStarted","Data":"88605f944fe9db9675ed52044c08972beadbb1e0b2b3b2f7ee7dec705d09982f"} Mar 10 14:24:01 crc kubenswrapper[4911]: I0310 14:24:01.627999 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-76d846bbc6-4wr5p" podStartSLOduration=2.627961066 podStartE2EDuration="2.627961066s" podCreationTimestamp="2026-03-10 14:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:01.61050245 +0000 UTC m=+1346.174022367" watchObservedRunningTime="2026-03-10 14:24:01.627961066 +0000 UTC m=+1346.191480983" Mar 10 14:24:01 crc kubenswrapper[4911]: I0310 14:24:01.666251 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-85swm" podStartSLOduration=3.319832575 podStartE2EDuration="45.666226255s" podCreationTimestamp="2026-03-10 14:23:16 +0000 UTC" firstStartedPulling="2026-03-10 14:23:18.336217801 +0000 UTC m=+1302.899737718" lastFinishedPulling="2026-03-10 14:24:00.682611481 +0000 UTC m=+1345.246131398" observedRunningTime="2026-03-10 14:24:01.645572686 +0000 UTC m=+1346.209092603" watchObservedRunningTime="2026-03-10 14:24:01.666226255 +0000 UTC m=+1346.229746172" Mar 10 14:24:02 crc kubenswrapper[4911]: I0310 14:24:02.629678 4911 generic.go:334] "Generic (PLEG): container finished" podID="86c9857f-704f-48ce-b90b-9275e9eba41a" containerID="b565c79511677b5104e61adf04d1b88b0ff1a7e493a7f6e74fe5f320924efb13" exitCode=0 Mar 10 14:24:02 crc kubenswrapper[4911]: I0310 14:24:02.629810 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2s85z" event={"ID":"86c9857f-704f-48ce-b90b-9275e9eba41a","Type":"ContainerDied","Data":"b565c79511677b5104e61adf04d1b88b0ff1a7e493a7f6e74fe5f320924efb13"} Mar 10 14:24:03 crc kubenswrapper[4911]: I0310 14:24:03.647254 4911 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-db-sync-k62d9" event={"ID":"9942f116-fd81-4e92-bd0f-add9b12b4c08","Type":"ContainerStarted","Data":"9061b4e578dbba907356514f9fc9ffcb1738d5dff660b5b603fe293f8737ed79"} Mar 10 14:24:03 crc kubenswrapper[4911]: I0310 14:24:03.650155 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552544-2kc8q" event={"ID":"b31175aa-9ebc-4576-a218-b9d926c1c559","Type":"ContainerStarted","Data":"bc04872203b2bef81c1fb176971693f1ccf0e54cdd64ce5625bab7414127f73c"} Mar 10 14:24:03 crc kubenswrapper[4911]: I0310 14:24:03.677823 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-k62d9" podStartSLOduration=3.878653214 podStartE2EDuration="47.677795114s" podCreationTimestamp="2026-03-10 14:23:16 +0000 UTC" firstStartedPulling="2026-03-10 14:23:18.382384597 +0000 UTC m=+1302.945904514" lastFinishedPulling="2026-03-10 14:24:02.181526497 +0000 UTC m=+1346.745046414" observedRunningTime="2026-03-10 14:24:03.673175674 +0000 UTC m=+1348.236695591" watchObservedRunningTime="2026-03-10 14:24:03.677795114 +0000 UTC m=+1348.241315031" Mar 10 14:24:03 crc kubenswrapper[4911]: I0310 14:24:03.695450 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552544-2kc8q" podStartSLOduration=1.8509828210000001 podStartE2EDuration="3.695425215s" podCreationTimestamp="2026-03-10 14:24:00 +0000 UTC" firstStartedPulling="2026-03-10 14:24:01.34984491 +0000 UTC m=+1345.913364817" lastFinishedPulling="2026-03-10 14:24:03.194287294 +0000 UTC m=+1347.757807211" observedRunningTime="2026-03-10 14:24:03.695148878 +0000 UTC m=+1348.258668795" watchObservedRunningTime="2026-03-10 14:24:03.695425215 +0000 UTC m=+1348.258945132" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.212196 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2s85z" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.383822 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c9857f-704f-48ce-b90b-9275e9eba41a-logs\") pod \"86c9857f-704f-48ce-b90b-9275e9eba41a\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.383951 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-scripts\") pod \"86c9857f-704f-48ce-b90b-9275e9eba41a\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.384057 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-combined-ca-bundle\") pod \"86c9857f-704f-48ce-b90b-9275e9eba41a\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.384147 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-config-data\") pod \"86c9857f-704f-48ce-b90b-9275e9eba41a\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.384200 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbjgr\" (UniqueName: \"kubernetes.io/projected/86c9857f-704f-48ce-b90b-9275e9eba41a-kube-api-access-pbjgr\") pod \"86c9857f-704f-48ce-b90b-9275e9eba41a\" (UID: \"86c9857f-704f-48ce-b90b-9275e9eba41a\") " Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.385559 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/86c9857f-704f-48ce-b90b-9275e9eba41a-logs" (OuterVolumeSpecName: "logs") pod "86c9857f-704f-48ce-b90b-9275e9eba41a" (UID: "86c9857f-704f-48ce-b90b-9275e9eba41a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.393666 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-scripts" (OuterVolumeSpecName: "scripts") pod "86c9857f-704f-48ce-b90b-9275e9eba41a" (UID: "86c9857f-704f-48ce-b90b-9275e9eba41a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.393880 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c9857f-704f-48ce-b90b-9275e9eba41a-kube-api-access-pbjgr" (OuterVolumeSpecName: "kube-api-access-pbjgr") pod "86c9857f-704f-48ce-b90b-9275e9eba41a" (UID: "86c9857f-704f-48ce-b90b-9275e9eba41a"). InnerVolumeSpecName "kube-api-access-pbjgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.435490 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-config-data" (OuterVolumeSpecName: "config-data") pod "86c9857f-704f-48ce-b90b-9275e9eba41a" (UID: "86c9857f-704f-48ce-b90b-9275e9eba41a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.442029 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86c9857f-704f-48ce-b90b-9275e9eba41a" (UID: "86c9857f-704f-48ce-b90b-9275e9eba41a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.486480 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c9857f-704f-48ce-b90b-9275e9eba41a-logs\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.486519 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.486530 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.486540 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c9857f-704f-48ce-b90b-9275e9eba41a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.486563 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbjgr\" (UniqueName: \"kubernetes.io/projected/86c9857f-704f-48ce-b90b-9275e9eba41a-kube-api-access-pbjgr\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.687128 4911 generic.go:334] "Generic (PLEG): container finished" podID="67b878c7-d1cf-4656-8762-7be57cf1491a" containerID="88605f944fe9db9675ed52044c08972beadbb1e0b2b3b2f7ee7dec705d09982f" exitCode=0 Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.687232 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-85swm" event={"ID":"67b878c7-d1cf-4656-8762-7be57cf1491a","Type":"ContainerDied","Data":"88605f944fe9db9675ed52044c08972beadbb1e0b2b3b2f7ee7dec705d09982f"} Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.699070 4911 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2s85z" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.699128 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2s85z" event={"ID":"86c9857f-704f-48ce-b90b-9275e9eba41a","Type":"ContainerDied","Data":"dd421ac8274f2e98374205cf457ffae84d5956847890feb5e1f733c7496a9adc"} Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.699194 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd421ac8274f2e98374205cf457ffae84d5956847890feb5e1f733c7496a9adc" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.728634 4911 generic.go:334] "Generic (PLEG): container finished" podID="b31175aa-9ebc-4576-a218-b9d926c1c559" containerID="bc04872203b2bef81c1fb176971693f1ccf0e54cdd64ce5625bab7414127f73c" exitCode=0 Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.728678 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552544-2kc8q" event={"ID":"b31175aa-9ebc-4576-a218-b9d926c1c559","Type":"ContainerDied","Data":"bc04872203b2bef81c1fb176971693f1ccf0e54cdd64ce5625bab7414127f73c"} Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.788639 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-577d67f998-s8wh9"] Mar 10 14:24:04 crc kubenswrapper[4911]: E0310 14:24:04.789183 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c9857f-704f-48ce-b90b-9275e9eba41a" containerName="placement-db-sync" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.789208 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c9857f-704f-48ce-b90b-9275e9eba41a" containerName="placement-db-sync" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.789442 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c9857f-704f-48ce-b90b-9275e9eba41a" containerName="placement-db-sync" Mar 10 
14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.792052 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.797231 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.797337 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.797349 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6z6nd" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.810144 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.810374 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.811153 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-577d67f998-s8wh9"] Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.905717 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-public-tls-certs\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.905845 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f385703-b741-42ec-a63e-ec5a371859de-logs\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:04 crc 
kubenswrapper[4911]: I0310 14:24:04.905987 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-scripts\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.906011 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-internal-tls-certs\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.906068 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-config-data\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.906127 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-combined-ca-bundle\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:04 crc kubenswrapper[4911]: I0310 14:24:04.906170 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpfh4\" (UniqueName: \"kubernetes.io/projected/6f385703-b741-42ec-a63e-ec5a371859de-kube-api-access-tpfh4\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" 
Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.008146 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-scripts\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.008215 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-internal-tls-certs\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.008258 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-config-data\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.008310 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-combined-ca-bundle\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.015441 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpfh4\" (UniqueName: \"kubernetes.io/projected/6f385703-b741-42ec-a63e-ec5a371859de-kube-api-access-tpfh4\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.015899 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-public-tls-certs\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.017100 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f385703-b741-42ec-a63e-ec5a371859de-logs\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.016016 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f385703-b741-42ec-a63e-ec5a371859de-logs\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.019649 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-combined-ca-bundle\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.032610 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-public-tls-certs\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.032714 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-internal-tls-certs\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.033172 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-config-data\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.036367 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f385703-b741-42ec-a63e-ec5a371859de-scripts\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.037592 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpfh4\" (UniqueName: \"kubernetes.io/projected/6f385703-b741-42ec-a63e-ec5a371859de-kube-api-access-tpfh4\") pod \"placement-577d67f998-s8wh9\" (UID: \"6f385703-b741-42ec-a63e-ec5a371859de\") " pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.139275 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.721476 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-577d67f998-s8wh9"] Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.750647 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-577d67f998-s8wh9" event={"ID":"6f385703-b741-42ec-a63e-ec5a371859de","Type":"ContainerStarted","Data":"7cf5db2c9760b00bdf20055afe119e6712990b223960aec2995281401ae32c90"} Mar 10 14:24:05 crc kubenswrapper[4911]: I0310 14:24:05.857369 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b4dd68964-gfvp8" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.081579 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-85swm" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.141448 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-db-sync-config-data\") pod \"67b878c7-d1cf-4656-8762-7be57cf1491a\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.141528 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-combined-ca-bundle\") pod \"67b878c7-d1cf-4656-8762-7be57cf1491a\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.141691 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bhpq\" (UniqueName: \"kubernetes.io/projected/67b878c7-d1cf-4656-8762-7be57cf1491a-kube-api-access-2bhpq\") pod \"67b878c7-d1cf-4656-8762-7be57cf1491a\" (UID: \"67b878c7-d1cf-4656-8762-7be57cf1491a\") " Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.147583 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "67b878c7-d1cf-4656-8762-7be57cf1491a" (UID: "67b878c7-d1cf-4656-8762-7be57cf1491a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.152137 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b878c7-d1cf-4656-8762-7be57cf1491a-kube-api-access-2bhpq" (OuterVolumeSpecName: "kube-api-access-2bhpq") pod "67b878c7-d1cf-4656-8762-7be57cf1491a" (UID: "67b878c7-d1cf-4656-8762-7be57cf1491a"). 
InnerVolumeSpecName "kube-api-access-2bhpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.193039 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54d884b5d4-lsz26" podUID="6be9e57d-52b9-4de2-9201-1b85feda712c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.219837 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67b878c7-d1cf-4656-8762-7be57cf1491a" (UID: "67b878c7-d1cf-4656-8762-7be57cf1491a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.245070 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bhpq\" (UniqueName: \"kubernetes.io/projected/67b878c7-d1cf-4656-8762-7be57cf1491a-kube-api-access-2bhpq\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.245662 4911 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.245674 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b878c7-d1cf-4656-8762-7be57cf1491a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.325400 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552544-2kc8q" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.451628 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb6ft\" (UniqueName: \"kubernetes.io/projected/b31175aa-9ebc-4576-a218-b9d926c1c559-kube-api-access-cb6ft\") pod \"b31175aa-9ebc-4576-a218-b9d926c1c559\" (UID: \"b31175aa-9ebc-4576-a218-b9d926c1c559\") " Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.459984 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31175aa-9ebc-4576-a218-b9d926c1c559-kube-api-access-cb6ft" (OuterVolumeSpecName: "kube-api-access-cb6ft") pod "b31175aa-9ebc-4576-a218-b9d926c1c559" (UID: "b31175aa-9ebc-4576-a218-b9d926c1c559"). InnerVolumeSpecName "kube-api-access-cb6ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.554859 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb6ft\" (UniqueName: \"kubernetes.io/projected/b31175aa-9ebc-4576-a218-b9d926c1c559-kube-api-access-cb6ft\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.810442 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-577d67f998-s8wh9" event={"ID":"6f385703-b741-42ec-a63e-ec5a371859de","Type":"ContainerStarted","Data":"02e24024f8e5c0e9797c6945466de086e3b240b2c98775c46d9654c63698e334"} Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.810535 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-577d67f998-s8wh9" event={"ID":"6f385703-b741-42ec-a63e-ec5a371859de","Type":"ContainerStarted","Data":"6e49f55368352547ab1c98893514ed76b9f07a465b75072331d29dcf8a6fa87d"} Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.810603 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-577d67f998-s8wh9" Mar 
10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.810629 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.816071 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552544-2kc8q" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.816846 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552544-2kc8q" event={"ID":"b31175aa-9ebc-4576-a218-b9d926c1c559","Type":"ContainerDied","Data":"08485fc4781600632058de451bafb2192306aed96e45e8858512eef094f08e85"} Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.816891 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08485fc4781600632058de451bafb2192306aed96e45e8858512eef094f08e85" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.824641 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552538-v5wxw"] Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.826004 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-85swm" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.825751 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-85swm" event={"ID":"67b878c7-d1cf-4656-8762-7be57cf1491a","Type":"ContainerDied","Data":"8a5ff338a4c1800c1f12ee0b62ffe0ebd429be69d9ae20da1a08b406bea7166c"} Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.828133 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a5ff338a4c1800c1f12ee0b62ffe0ebd429be69d9ae20da1a08b406bea7166c" Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.846655 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552538-v5wxw"] Mar 10 14:24:06 crc kubenswrapper[4911]: I0310 14:24:06.847145 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-577d67f998-s8wh9" podStartSLOduration=2.847120467 podStartE2EDuration="2.847120467s" podCreationTimestamp="2026-03-10 14:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:06.837527426 +0000 UTC m=+1351.401047343" watchObservedRunningTime="2026-03-10 14:24:06.847120467 +0000 UTC m=+1351.410640384" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.018273 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-8599db9-k9r6m"] Mar 10 14:24:07 crc kubenswrapper[4911]: E0310 14:24:07.018743 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b878c7-d1cf-4656-8762-7be57cf1491a" containerName="barbican-db-sync" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.018764 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b878c7-d1cf-4656-8762-7be57cf1491a" containerName="barbican-db-sync" Mar 10 14:24:07 crc kubenswrapper[4911]: E0310 14:24:07.018800 4911 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b31175aa-9ebc-4576-a218-b9d926c1c559" containerName="oc" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.018807 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31175aa-9ebc-4576-a218-b9d926c1c559" containerName="oc" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.019044 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31175aa-9ebc-4576-a218-b9d926c1c559" containerName="oc" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.019072 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="67b878c7-d1cf-4656-8762-7be57cf1491a" containerName="barbican-db-sync" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.022989 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.035614 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.035680 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-t4gl8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.036119 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.053134 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8599db9-k9r6m"] Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.171414 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7lw4\" (UniqueName: \"kubernetes.io/projected/c049ccee-c503-43b1-b263-c6ee453e93e0-kube-api-access-q7lw4\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 
14:24:07.172102 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c049ccee-c503-43b1-b263-c6ee453e93e0-combined-ca-bundle\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.172147 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c049ccee-c503-43b1-b263-c6ee453e93e0-logs\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.172200 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c049ccee-c503-43b1-b263-c6ee453e93e0-config-data-custom\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.172230 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c049ccee-c503-43b1-b263-c6ee453e93e0-config-data\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.196794 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8f48b4f88-jjg7s"] Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.198802 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.205083 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.226811 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-txf9k"] Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.228784 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.229331 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8f48b4f88-jjg7s"] Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.237307 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-txf9k"] Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.283350 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-combined-ca-bundle\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.283445 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg56h\" (UniqueName: \"kubernetes.io/projected/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-kube-api-access-gg56h\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.283489 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-config-data-custom\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.283508 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-logs\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.283545 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c049ccee-c503-43b1-b263-c6ee453e93e0-combined-ca-bundle\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.283580 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c049ccee-c503-43b1-b263-c6ee453e93e0-logs\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.283607 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-config-data\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.283637 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c049ccee-c503-43b1-b263-c6ee453e93e0-config-data-custom\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.283669 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c049ccee-c503-43b1-b263-c6ee453e93e0-config-data\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.283743 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7lw4\" (UniqueName: \"kubernetes.io/projected/c049ccee-c503-43b1-b263-c6ee453e93e0-kube-api-access-q7lw4\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.285120 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c049ccee-c503-43b1-b263-c6ee453e93e0-logs\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.298785 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c049ccee-c503-43b1-b263-c6ee453e93e0-combined-ca-bundle\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.306163 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/c049ccee-c503-43b1-b263-c6ee453e93e0-config-data-custom\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.318214 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c049ccee-c503-43b1-b263-c6ee453e93e0-config-data\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.350124 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7lw4\" (UniqueName: \"kubernetes.io/projected/c049ccee-c503-43b1-b263-c6ee453e93e0-kube-api-access-q7lw4\") pod \"barbican-worker-8599db9-k9r6m\" (UID: \"c049ccee-c503-43b1-b263-c6ee453e93e0\") " pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.354314 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-8599db9-k9r6m" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389493 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg56h\" (UniqueName: \"kubernetes.io/projected/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-kube-api-access-gg56h\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389553 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-config-data-custom\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389575 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-logs\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389629 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-config-data\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389656 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389690 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389718 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x9bm\" (UniqueName: \"kubernetes.io/projected/28a4dbd7-7594-4787-a138-952c605462c2-kube-api-access-9x9bm\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389756 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-config\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389780 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-svc\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389799 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.389824 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-combined-ca-bundle\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.400297 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-logs\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.407521 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-config-data-custom\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.408112 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-combined-ca-bundle\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.424384 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-config-data\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.462338 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg56h\" (UniqueName: \"kubernetes.io/projected/d66c76f8-6b9a-40d3-b5fc-d2d5790928f6-kube-api-access-gg56h\") pod \"barbican-keystone-listener-8f48b4f88-jjg7s\" (UID: \"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6\") " pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.491402 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.491492 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.491522 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x9bm\" (UniqueName: \"kubernetes.io/projected/28a4dbd7-7594-4787-a138-952c605462c2-kube-api-access-9x9bm\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.491548 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-config\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.491571 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-svc\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.491593 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.500089 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.509469 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-config\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.509996 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.510120 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-svc\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.510706 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.542827 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-554f4c9c94-4jmq8"] Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.545192 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.569144 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.569742 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.598139 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcptr\" (UniqueName: \"kubernetes.io/projected/aa83f400-7839-4e47-a578-73f2cad08034-kube-api-access-dcptr\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.598244 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa83f400-7839-4e47-a578-73f2cad08034-logs\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.598305 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.598358 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data-custom\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.598398 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-combined-ca-bundle\") pod 
\"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.608987 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x9bm\" (UniqueName: \"kubernetes.io/projected/28a4dbd7-7594-4787-a138-952c605462c2-kube-api-access-9x9bm\") pod \"dnsmasq-dns-85ff748b95-txf9k\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.649837 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-554f4c9c94-4jmq8"] Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.702276 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.702759 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data-custom\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.702801 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-combined-ca-bundle\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.702904 4911 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-dcptr\" (UniqueName: \"kubernetes.io/projected/aa83f400-7839-4e47-a578-73f2cad08034-kube-api-access-dcptr\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.702966 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa83f400-7839-4e47-a578-73f2cad08034-logs\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.703586 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa83f400-7839-4e47-a578-73f2cad08034-logs\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.736355 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data-custom\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.738545 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-combined-ca-bundle\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.759945 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcptr\" (UniqueName: 
\"kubernetes.io/projected/aa83f400-7839-4e47-a578-73f2cad08034-kube-api-access-dcptr\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.782011 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data\") pod \"barbican-api-554f4c9c94-4jmq8\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.821522 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.972178 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8599db9-k9r6m"] Mar 10 14:24:07 crc kubenswrapper[4911]: I0310 14:24:07.983829 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:08 crc kubenswrapper[4911]: I0310 14:24:08.208562 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2581a8bf-f280-4d52-a410-e1b8a019bfaa" path="/var/lib/kubelet/pods/2581a8bf-f280-4d52-a410-e1b8a019bfaa/volumes" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.478597 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6fb47b4698-gx22c"] Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.480834 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.487930 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.495106 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.520165 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fb47b4698-gx22c"] Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.580259 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-config-data\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.580333 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-public-tls-certs\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.580389 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-combined-ca-bundle\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.580457 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2689d664-9bf8-4c5b-8c53-353286854071-logs\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.580479 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q8lk\" (UniqueName: \"kubernetes.io/projected/2689d664-9bf8-4c5b-8c53-353286854071-kube-api-access-6q8lk\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.580516 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-config-data-custom\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.580593 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-internal-tls-certs\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.682658 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2689d664-9bf8-4c5b-8c53-353286854071-logs\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.682717 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q8lk\" 
(UniqueName: \"kubernetes.io/projected/2689d664-9bf8-4c5b-8c53-353286854071-kube-api-access-6q8lk\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.682758 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-config-data-custom\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.682829 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-internal-tls-certs\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.682881 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-config-data\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.682910 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-public-tls-certs\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.682947 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-combined-ca-bundle\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.684094 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2689d664-9bf8-4c5b-8c53-353286854071-logs\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.689964 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-config-data\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.690205 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-internal-tls-certs\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.692176 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-combined-ca-bundle\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.695439 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-public-tls-certs\") pod 
\"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.698490 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2689d664-9bf8-4c5b-8c53-353286854071-config-data-custom\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.704919 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q8lk\" (UniqueName: \"kubernetes.io/projected/2689d664-9bf8-4c5b-8c53-353286854071-kube-api-access-6q8lk\") pod \"barbican-api-6fb47b4698-gx22c\" (UID: \"2689d664-9bf8-4c5b-8c53-353286854071\") " pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.802742 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.885126 4911 generic.go:334] "Generic (PLEG): container finished" podID="9942f116-fd81-4e92-bd0f-add9b12b4c08" containerID="9061b4e578dbba907356514f9fc9ffcb1738d5dff660b5b603fe293f8737ed79" exitCode=0 Mar 10 14:24:10 crc kubenswrapper[4911]: I0310 14:24:10.885178 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-k62d9" event={"ID":"9942f116-fd81-4e92-bd0f-add9b12b4c08","Type":"ContainerDied","Data":"9061b4e578dbba907356514f9fc9ffcb1738d5dff660b5b603fe293f8737ed79"} Mar 10 14:24:12 crc kubenswrapper[4911]: W0310 14:24:12.982234 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc049ccee_c503_43b1_b263_c6ee453e93e0.slice/crio-d55330df3e58b79491448a4cead5657bc8f3051cfbeb16b3cadaf58740e7498d WatchSource:0}: Error finding container d55330df3e58b79491448a4cead5657bc8f3051cfbeb16b3cadaf58740e7498d: Status 404 returned error can't find the container with id d55330df3e58b79491448a4cead5657bc8f3051cfbeb16b3cadaf58740e7498d Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.089519 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-k62d9" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.158098 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-scripts\") pod \"9942f116-fd81-4e92-bd0f-add9b12b4c08\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.158465 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-combined-ca-bundle\") pod \"9942f116-fd81-4e92-bd0f-add9b12b4c08\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.159469 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9942f116-fd81-4e92-bd0f-add9b12b4c08-etc-machine-id\") pod \"9942f116-fd81-4e92-bd0f-add9b12b4c08\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.160024 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6tjk\" (UniqueName: \"kubernetes.io/projected/9942f116-fd81-4e92-bd0f-add9b12b4c08-kube-api-access-k6tjk\") pod \"9942f116-fd81-4e92-bd0f-add9b12b4c08\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.160120 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-db-sync-config-data\") pod \"9942f116-fd81-4e92-bd0f-add9b12b4c08\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.160269 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-config-data\") pod \"9942f116-fd81-4e92-bd0f-add9b12b4c08\" (UID: \"9942f116-fd81-4e92-bd0f-add9b12b4c08\") " Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.159579 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9942f116-fd81-4e92-bd0f-add9b12b4c08-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9942f116-fd81-4e92-bd0f-add9b12b4c08" (UID: "9942f116-fd81-4e92-bd0f-add9b12b4c08"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.167214 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9942f116-fd81-4e92-bd0f-add9b12b4c08" (UID: "9942f116-fd81-4e92-bd0f-add9b12b4c08"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.167369 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9942f116-fd81-4e92-bd0f-add9b12b4c08-kube-api-access-k6tjk" (OuterVolumeSpecName: "kube-api-access-k6tjk") pod "9942f116-fd81-4e92-bd0f-add9b12b4c08" (UID: "9942f116-fd81-4e92-bd0f-add9b12b4c08"). InnerVolumeSpecName "kube-api-access-k6tjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.181305 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-scripts" (OuterVolumeSpecName: "scripts") pod "9942f116-fd81-4e92-bd0f-add9b12b4c08" (UID: "9942f116-fd81-4e92-bd0f-add9b12b4c08"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.204535 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9942f116-fd81-4e92-bd0f-add9b12b4c08" (UID: "9942f116-fd81-4e92-bd0f-add9b12b4c08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.221453 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-config-data" (OuterVolumeSpecName: "config-data") pod "9942f116-fd81-4e92-bd0f-add9b12b4c08" (UID: "9942f116-fd81-4e92-bd0f-add9b12b4c08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.263410 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.263449 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.263461 4911 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9942f116-fd81-4e92-bd0f-add9b12b4c08-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.263470 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6tjk\" (UniqueName: \"kubernetes.io/projected/9942f116-fd81-4e92-bd0f-add9b12b4c08-kube-api-access-k6tjk\") on node \"crc\" DevicePath 
\"\"" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.263478 4911 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.263486 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9942f116-fd81-4e92-bd0f-add9b12b4c08-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.938287 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8599db9-k9r6m" event={"ID":"c049ccee-c503-43b1-b263-c6ee453e93e0","Type":"ContainerStarted","Data":"d55330df3e58b79491448a4cead5657bc8f3051cfbeb16b3cadaf58740e7498d"} Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.940143 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-k62d9" event={"ID":"9942f116-fd81-4e92-bd0f-add9b12b4c08","Type":"ContainerDied","Data":"e1d584d6deab54a0744f4f1a28bd19127f2e71d21515005941a42a78de7099db"} Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.940170 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1d584d6deab54a0744f4f1a28bd19127f2e71d21515005941a42a78de7099db" Mar 10 14:24:13 crc kubenswrapper[4911]: I0310 14:24:13.940232 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-k62d9" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.477381 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 14:24:14 crc kubenswrapper[4911]: E0310 14:24:14.477879 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9942f116-fd81-4e92-bd0f-add9b12b4c08" containerName="cinder-db-sync" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.477894 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9942f116-fd81-4e92-bd0f-add9b12b4c08" containerName="cinder-db-sync" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.478083 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="9942f116-fd81-4e92-bd0f-add9b12b4c08" containerName="cinder-db-sync" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.481481 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.485790 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.486347 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-cg8dm" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.486494 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.486632 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.526180 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.571078 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-txf9k"] Mar 10 14:24:14 crc 
kubenswrapper[4911]: I0310 14:24:14.601132 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2hkbh"] Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.604489 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.606136 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.606188 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.606232 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-scripts\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.606277 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r4q9\" (UniqueName: \"kubernetes.io/projected/8a9f0716-d281-468e-9ba9-d77699793836-kube-api-access-5r4q9\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.606320 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.606355 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a9f0716-d281-468e-9ba9-d77699793836-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.622568 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2hkbh"] Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.708896 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.708960 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.709010 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a9f0716-d281-468e-9ba9-d77699793836-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " 
pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.709034 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.709142 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.709174 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.709197 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.709220 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 
14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.709244 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-config\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.709293 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-scripts\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.709339 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4nrd\" (UniqueName: \"kubernetes.io/projected/5322d280-17b4-489e-a867-6527ce33d3f4-kube-api-access-z4nrd\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.709375 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4q9\" (UniqueName: \"kubernetes.io/projected/8a9f0716-d281-468e-9ba9-d77699793836-kube-api-access-5r4q9\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.710561 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a9f0716-d281-468e-9ba9-d77699793836-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.719849 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-scripts\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.730568 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.746116 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r4q9\" (UniqueName: \"kubernetes.io/projected/8a9f0716-d281-468e-9ba9-d77699793836-kube-api-access-5r4q9\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.749158 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.757011 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data\") pod \"cinder-scheduler-0\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") " pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.814486 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.816025 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.816075 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.816094 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-config\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.816157 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4nrd\" (UniqueName: \"kubernetes.io/projected/5322d280-17b4-489e-a867-6527ce33d3f4-kube-api-access-z4nrd\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.816214 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 
14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.816243 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh"
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.817309 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh"
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.817454 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-config\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh"
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.817463 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh"
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.818111 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh"
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.822764 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh"
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.858214 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4nrd\" (UniqueName: \"kubernetes.io/projected/5322d280-17b4-489e-a867-6527ce33d3f4-kube-api-access-z4nrd\") pod \"dnsmasq-dns-5c9776ccc5-2hkbh\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh"
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.906662 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.908851 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.916463 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.929513 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 14:24:14 crc kubenswrapper[4911]: I0310 14:24:14.930213 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.020773 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f163cfe-1829-40bc-8117-4b2b1c072a91-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.021101 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data-custom\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.021177 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66br7\" (UniqueName: \"kubernetes.io/projected/2f163cfe-1829-40bc-8117-4b2b1c072a91-kube-api-access-66br7\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.021233 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.021290 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.021313 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f163cfe-1829-40bc-8117-4b2b1c072a91-logs\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.021335 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-scripts\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.123299 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.123355 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f163cfe-1829-40bc-8117-4b2b1c072a91-logs\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.123388 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-scripts\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.123443 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f163cfe-1829-40bc-8117-4b2b1c072a91-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.123462 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data-custom\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.123521 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66br7\" (UniqueName: \"kubernetes.io/projected/2f163cfe-1829-40bc-8117-4b2b1c072a91-kube-api-access-66br7\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.123553 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.124155 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f163cfe-1829-40bc-8117-4b2b1c072a91-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.125493 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f163cfe-1829-40bc-8117-4b2b1c072a91-logs\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.130333 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-scripts\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.130715 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data-custom\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.131363 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.151201 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66br7\" (UniqueName: \"kubernetes.io/projected/2f163cfe-1829-40bc-8117-4b2b1c072a91-kube-api-access-66br7\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.152157 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data\") pod \"cinder-api-0\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") " pod="openstack/cinder-api-0"
Mar 10 14:24:15 crc kubenswrapper[4911]: I0310 14:24:15.230738 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 14:24:16 crc kubenswrapper[4911]: I0310 14:24:16.877716 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.005300 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"add8035b-c6a4-47d1-aa42-ed381ba87b11","Type":"ContainerStarted","Data":"b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602"}
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.005518 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="ceilometer-central-agent" containerID="cri-o://6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615" gracePeriod=30
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.005863 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.006178 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="proxy-httpd" containerID="cri-o://b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602" gracePeriod=30
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.006230 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="sg-core" containerID="cri-o://1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176" gracePeriod=30
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.006270 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="ceilometer-notification-agent" containerID="cri-o://5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46" gracePeriod=30
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.030492 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.678089274 podStartE2EDuration="1m1.030460918s" podCreationTimestamp="2026-03-10 14:23:16 +0000 UTC" firstStartedPulling="2026-03-10 14:23:18.37830461 +0000 UTC m=+1302.941824527" lastFinishedPulling="2026-03-10 14:24:15.730676254 +0000 UTC m=+1360.294196171" observedRunningTime="2026-03-10 14:24:17.027801699 +0000 UTC m=+1361.591321616" watchObservedRunningTime="2026-03-10 14:24:17.030460918 +0000 UTC m=+1361.593980835"
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.286989 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.297863 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fb47b4698-gx22c"]
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.308845 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-txf9k"]
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.320090 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-554f4c9c94-4jmq8"]
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.346361 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8f48b4f88-jjg7s"]
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.393530 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2hkbh"]
Mar 10 14:24:17 crc kubenswrapper[4911]: I0310 14:24:17.420619 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 14:24:17 crc kubenswrapper[4911]: W0310 14:24:17.693560 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a9f0716_d281_468e_9ba9_d77699793836.slice/crio-fd222f4273d70d5deb157861b202b50db69ada7835a061c31cf93a947443b273 WatchSource:0}: Error finding container fd222f4273d70d5deb157861b202b50db69ada7835a061c31cf93a947443b273: Status 404 returned error can't find the container with id fd222f4273d70d5deb157861b202b50db69ada7835a061c31cf93a947443b273
Mar 10 14:24:17 crc kubenswrapper[4911]: W0310 14:24:17.755125 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa83f400_7839_4e47_a578_73f2cad08034.slice/crio-7c3a05044f3d6005ed67943bd424c03919bf6131f5857992b68d360b1a2611d0 WatchSource:0}: Error finding container 7c3a05044f3d6005ed67943bd424c03919bf6131f5857992b68d360b1a2611d0: Status 404 returned error can't find the container with id 7c3a05044f3d6005ed67943bd424c03919bf6131f5857992b68d360b1a2611d0
Mar 10 14:24:17 crc kubenswrapper[4911]: W0310 14:24:17.771629 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28a4dbd7_7594_4787_a138_952c605462c2.slice/crio-c3c199f3a4a982567698cd9733a4cfc400eabc72755c38b7d88c69f34853efa6 WatchSource:0}: Error finding container c3c199f3a4a982567698cd9733a4cfc400eabc72755c38b7d88c69f34853efa6: Status 404 returned error can't find the container with id c3c199f3a4a982567698cd9733a4cfc400eabc72755c38b7d88c69f34853efa6
Mar 10 14:24:17 crc kubenswrapper[4911]: W0310 14:24:17.789248 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f163cfe_1829_40bc_8117_4b2b1c072a91.slice/crio-927bf3cc27c05a1937d0b82dc55ac910236cfadd20091a5b5ecedde445ce37a7 WatchSource:0}: Error finding container 927bf3cc27c05a1937d0b82dc55ac910236cfadd20091a5b5ecedde445ce37a7: Status 404 returned error can't find the container with id 927bf3cc27c05a1937d0b82dc55ac910236cfadd20091a5b5ecedde445ce37a7
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.017104 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554f4c9c94-4jmq8" event={"ID":"aa83f400-7839-4e47-a578-73f2cad08034","Type":"ContainerStarted","Data":"7c3a05044f3d6005ed67943bd424c03919bf6131f5857992b68d360b1a2611d0"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.025026 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fb47b4698-gx22c" event={"ID":"2689d664-9bf8-4c5b-8c53-353286854071","Type":"ContainerStarted","Data":"ee1bef23e2ca9f097cb51970aad290dffbc00574a522e7db7f01133146313fbe"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.027579 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cf9dfcf44-sls8z"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.028871 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" event={"ID":"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6","Type":"ContainerStarted","Data":"09537fe13918ee4dd363a9d19b8351edd15d38db2cdf3bcc5879f493725bd4d1"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.087146 4911 generic.go:334] "Generic (PLEG): container finished" podID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerID="b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602" exitCode=0
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.087187 4911 generic.go:334] "Generic (PLEG): container finished" podID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerID="1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176" exitCode=2
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.087195 4911 generic.go:334] "Generic (PLEG): container finished" podID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerID="6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615" exitCode=0
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.087243 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"add8035b-c6a4-47d1-aa42-ed381ba87b11","Type":"ContainerDied","Data":"b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.087274 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"add8035b-c6a4-47d1-aa42-ed381ba87b11","Type":"ContainerDied","Data":"1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.087285 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"add8035b-c6a4-47d1-aa42-ed381ba87b11","Type":"ContainerDied","Data":"6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.114681 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-txf9k" event={"ID":"28a4dbd7-7594-4787-a138-952c605462c2","Type":"ContainerStarted","Data":"c3c199f3a4a982567698cd9733a4cfc400eabc72755c38b7d88c69f34853efa6"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.124790 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" event={"ID":"5322d280-17b4-489e-a867-6527ce33d3f4","Type":"ContainerStarted","Data":"32a4d954f24cac78ede323cd03a265b33fdca4880152fdfc1c8b0ec5808bc0ea"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.129263 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f163cfe-1829-40bc-8117-4b2b1c072a91","Type":"ContainerStarted","Data":"927bf3cc27c05a1937d0b82dc55ac910236cfadd20091a5b5ecedde445ce37a7"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.145002 4911 generic.go:334] "Generic (PLEG): container finished" podID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" containerID="9830e5c56d818f9394253b818790bc839d6a9e709579ea217ea4f8467a1ed1a3" exitCode=137
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.145043 4911 generic.go:334] "Generic (PLEG): container finished" podID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" containerID="a869305265bf235adabf35ccd01a2083f5054d9da239357e1f4078c1119f14e2" exitCode=137
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.145139 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54cf4db59f-7jqnh" event={"ID":"5046ce07-0ef5-4efc-aaf4-0867d7acea09","Type":"ContainerDied","Data":"9830e5c56d818f9394253b818790bc839d6a9e709579ea217ea4f8467a1ed1a3"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.145175 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54cf4db59f-7jqnh" event={"ID":"5046ce07-0ef5-4efc-aaf4-0867d7acea09","Type":"ContainerDied","Data":"a869305265bf235adabf35ccd01a2083f5054d9da239357e1f4078c1119f14e2"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.160211 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a9f0716-d281-468e-9ba9-d77699793836","Type":"ContainerStarted","Data":"fd222f4273d70d5deb157861b202b50db69ada7835a061c31cf93a947443b273"}
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.303084 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54cf4db59f-7jqnh"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.426143 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5046ce07-0ef5-4efc-aaf4-0867d7acea09-horizon-secret-key\") pod \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") "
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.426860 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5046ce07-0ef5-4efc-aaf4-0867d7acea09-logs\") pod \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") "
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.426896 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nn5v\" (UniqueName: \"kubernetes.io/projected/5046ce07-0ef5-4efc-aaf4-0867d7acea09-kube-api-access-8nn5v\") pod \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") "
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.426995 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-config-data\") pod \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") "
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.427064 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-scripts\") pod \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\" (UID: \"5046ce07-0ef5-4efc-aaf4-0867d7acea09\") "
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.442260 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5046ce07-0ef5-4efc-aaf4-0867d7acea09-logs" (OuterVolumeSpecName: "logs") pod "5046ce07-0ef5-4efc-aaf4-0867d7acea09" (UID: "5046ce07-0ef5-4efc-aaf4-0867d7acea09"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.447656 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5046ce07-0ef5-4efc-aaf4-0867d7acea09-kube-api-access-8nn5v" (OuterVolumeSpecName: "kube-api-access-8nn5v") pod "5046ce07-0ef5-4efc-aaf4-0867d7acea09" (UID: "5046ce07-0ef5-4efc-aaf4-0867d7acea09"). InnerVolumeSpecName "kube-api-access-8nn5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.452281 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5046ce07-0ef5-4efc-aaf4-0867d7acea09-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5046ce07-0ef5-4efc-aaf4-0867d7acea09" (UID: "5046ce07-0ef5-4efc-aaf4-0867d7acea09"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.482589 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78dcb5b94f-bjgh8"]
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.482903 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78dcb5b94f-bjgh8" podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerName="neutron-api" containerID="cri-o://4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122" gracePeriod=30
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.484048 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78dcb5b94f-bjgh8" podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerName="neutron-httpd" containerID="cri-o://94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea" gracePeriod=30
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.500322 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d48f5c7d5-2xxzq"]
Mar 10 14:24:18 crc kubenswrapper[4911]: E0310 14:24:18.506973 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" containerName="horizon"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.507017 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" containerName="horizon"
Mar 10 14:24:18 crc kubenswrapper[4911]: E0310 14:24:18.507080 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" containerName="horizon-log"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.507087 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" containerName="horizon-log"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.507475 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" containerName="horizon-log"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.507488 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" containerName="horizon"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.520475 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.520563 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.522249 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.537158 4911 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5046ce07-0ef5-4efc-aaf4-0867d7acea09-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.537188 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5046ce07-0ef5-4efc-aaf4-0867d7acea09-logs\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.537198 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nn5v\" (UniqueName: \"kubernetes.io/projected/5046ce07-0ef5-4efc-aaf4-0867d7acea09-kube-api-access-8nn5v\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.555015 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-config-data" (OuterVolumeSpecName: "config-data") pod "5046ce07-0ef5-4efc-aaf4-0867d7acea09" (UID: "5046ce07-0ef5-4efc-aaf4-0867d7acea09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.557533 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d48f5c7d5-2xxzq"]
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.564638 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-scripts" (OuterVolumeSpecName: "scripts") pod "5046ce07-0ef5-4efc-aaf4-0867d7acea09" (UID: "5046ce07-0ef5-4efc-aaf4-0867d7acea09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.639116 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cpww\" (UniqueName: \"kubernetes.io/projected/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-kube-api-access-8cpww\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.639164 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-internal-tls-certs\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.639231 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-httpd-config\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.639256 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-config\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.639287 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-ovndb-tls-certs\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.639328 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-combined-ca-bundle\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.639367 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-public-tls-certs\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.639425 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.639441 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5046ce07-0ef5-4efc-aaf4-0867d7acea09-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.745123 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cpww\" (UniqueName: \"kubernetes.io/projected/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-kube-api-access-8cpww\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.745854 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-internal-tls-certs\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.745971 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-httpd-config\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.746009 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-config\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.746053 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-ovndb-tls-certs\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.746102 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-combined-ca-bundle\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.746162 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-public-tls-certs\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.754610 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-ovndb-tls-certs\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.759393 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-httpd-config\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.776974 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-combined-ca-bundle\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.779637 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-internal-tls-certs\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.780221 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cpww\" (UniqueName: \"kubernetes.io/projected/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-kube-api-access-8cpww\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.786640 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-public-tls-certs\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.788950 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5be3e6b2-8478-41bf-9fb1-09e053e8b5ac-config\") pod \"neutron-6d48f5c7d5-2xxzq\" (UID: \"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac\") " pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.864994 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-78dcb5b94f-bjgh8" podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9696/\": read tcp 10.217.0.2:45764->10.217.0.160:9696: read: connection reset by peer"
Mar 10 14:24:18 crc kubenswrapper[4911]: I0310 14:24:18.873865 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d48f5c7d5-2xxzq" Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:18.996367 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.303056 4911 generic.go:334] "Generic (PLEG): container finished" podID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerID="94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea" exitCode=0 Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.303193 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78dcb5b94f-bjgh8" event={"ID":"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303","Type":"ContainerDied","Data":"94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea"} Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.452115 4911 generic.go:334] "Generic (PLEG): container finished" podID="28a4dbd7-7594-4787-a138-952c605462c2" containerID="587fc87c892d2b7d922841d475d936859dda2e5001a59b6f888afc665b852e29" exitCode=0 Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.452354 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-txf9k" event={"ID":"28a4dbd7-7594-4787-a138-952c605462c2","Type":"ContainerDied","Data":"587fc87c892d2b7d922841d475d936859dda2e5001a59b6f888afc665b852e29"} Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.456827 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-54d884b5d4-lsz26" Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.489860 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8599db9-k9r6m" event={"ID":"c049ccee-c503-43b1-b263-c6ee453e93e0","Type":"ContainerStarted","Data":"e65d935d3875bbdb1f5b7ac6669d557a333096872be06e21172b11f097f1b1de"} Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.521997 4911 generic.go:334] "Generic (PLEG): container finished" 
podID="5322d280-17b4-489e-a867-6527ce33d3f4" containerID="8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c" exitCode=0 Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.522081 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" event={"ID":"5322d280-17b4-489e-a867-6527ce33d3f4","Type":"ContainerDied","Data":"8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c"} Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.526174 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554f4c9c94-4jmq8" event={"ID":"aa83f400-7839-4e47-a578-73f2cad08034","Type":"ContainerStarted","Data":"a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840"} Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.526207 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554f4c9c94-4jmq8" event={"ID":"aa83f400-7839-4e47-a578-73f2cad08034","Type":"ContainerStarted","Data":"3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1"} Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.527173 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.527195 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.583806 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fb47b4698-gx22c" event={"ID":"2689d664-9bf8-4c5b-8c53-353286854071","Type":"ContainerStarted","Data":"79e05dd5e220f1009f15dac2b41a12da92e4b8bf39462c889ee0d8ce45de35c6"} Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.657328 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54cf4db59f-7jqnh" 
event={"ID":"5046ce07-0ef5-4efc-aaf4-0867d7acea09","Type":"ContainerDied","Data":"50d04e31115973ce6dd5d4a98c5f413e55e28be2c72c9ee25a7b43ac0abd8dd6"} Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.657413 4911 scope.go:117] "RemoveContainer" containerID="9830e5c56d818f9394253b818790bc839d6a9e709579ea217ea4f8467a1ed1a3" Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.657606 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54cf4db59f-7jqnh" Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.705744 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-554f4c9c94-4jmq8" podStartSLOduration=12.705652163 podStartE2EDuration="12.705652163s" podCreationTimestamp="2026-03-10 14:24:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:19.561408824 +0000 UTC m=+1364.124928741" watchObservedRunningTime="2026-03-10 14:24:19.705652163 +0000 UTC m=+1364.269172080" Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.806967 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54cf4db59f-7jqnh"] Mar 10 14:24:19 crc kubenswrapper[4911]: I0310 14:24:19.836047 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54cf4db59f-7jqnh"] Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.102873 4911 scope.go:117] "RemoveContainer" containerID="a869305265bf235adabf35ccd01a2083f5054d9da239357e1f4078c1119f14e2" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.232494 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5046ce07-0ef5-4efc-aaf4-0867d7acea09" path="/var/lib/kubelet/pods/5046ce07-0ef5-4efc-aaf4-0867d7acea09/volumes" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.306553 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-78dcb5b94f-bjgh8" 
podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9696/\": dial tcp 10.217.0.160:9696: connect: connection refused" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.325261 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d48f5c7d5-2xxzq"] Mar 10 14:24:20 crc kubenswrapper[4911]: W0310 14:24:20.358609 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5be3e6b2_8478_41bf_9fb1_09e053e8b5ac.slice/crio-d3feacc3331804e1712f41fc23e6f1505c7e388c8e66bd2eac46928c7c297d67 WatchSource:0}: Error finding container d3feacc3331804e1712f41fc23e6f1505c7e388c8e66bd2eac46928c7c297d67: Status 404 returned error can't find the container with id d3feacc3331804e1712f41fc23e6f1505c7e388c8e66bd2eac46928c7c297d67 Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.488936 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.521614 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-svc\") pod \"28a4dbd7-7594-4787-a138-952c605462c2\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.522190 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-swift-storage-0\") pod \"28a4dbd7-7594-4787-a138-952c605462c2\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.522356 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-nb\") pod \"28a4dbd7-7594-4787-a138-952c605462c2\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.522961 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-sb\") pod \"28a4dbd7-7594-4787-a138-952c605462c2\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.523064 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x9bm\" (UniqueName: \"kubernetes.io/projected/28a4dbd7-7594-4787-a138-952c605462c2-kube-api-access-9x9bm\") pod \"28a4dbd7-7594-4787-a138-952c605462c2\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.523330 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-config\") pod \"28a4dbd7-7594-4787-a138-952c605462c2\" (UID: \"28a4dbd7-7594-4787-a138-952c605462c2\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.544946 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a4dbd7-7594-4787-a138-952c605462c2-kube-api-access-9x9bm" (OuterVolumeSpecName: "kube-api-access-9x9bm") pod "28a4dbd7-7594-4787-a138-952c605462c2" (UID: "28a4dbd7-7594-4787-a138-952c605462c2"). InnerVolumeSpecName "kube-api-access-9x9bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.576314 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.630626 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-config-data\") pod \"add8035b-c6a4-47d1-aa42-ed381ba87b11\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.630803 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-log-httpd\") pod \"add8035b-c6a4-47d1-aa42-ed381ba87b11\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.630943 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-combined-ca-bundle\") pod \"add8035b-c6a4-47d1-aa42-ed381ba87b11\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.631142 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-sg-core-conf-yaml\") pod \"add8035b-c6a4-47d1-aa42-ed381ba87b11\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.631383 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pfvx\" (UniqueName: \"kubernetes.io/projected/add8035b-c6a4-47d1-aa42-ed381ba87b11-kube-api-access-4pfvx\") pod \"add8035b-c6a4-47d1-aa42-ed381ba87b11\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.632211 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-scripts\") pod \"add8035b-c6a4-47d1-aa42-ed381ba87b11\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.632505 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-run-httpd\") pod \"add8035b-c6a4-47d1-aa42-ed381ba87b11\" (UID: \"add8035b-c6a4-47d1-aa42-ed381ba87b11\") " Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.637290 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x9bm\" (UniqueName: \"kubernetes.io/projected/28a4dbd7-7594-4787-a138-952c605462c2-kube-api-access-9x9bm\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.641846 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "add8035b-c6a4-47d1-aa42-ed381ba87b11" (UID: "add8035b-c6a4-47d1-aa42-ed381ba87b11"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.647365 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "add8035b-c6a4-47d1-aa42-ed381ba87b11" (UID: "add8035b-c6a4-47d1-aa42-ed381ba87b11"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.662738 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add8035b-c6a4-47d1-aa42-ed381ba87b11-kube-api-access-4pfvx" (OuterVolumeSpecName: "kube-api-access-4pfvx") pod "add8035b-c6a4-47d1-aa42-ed381ba87b11" (UID: "add8035b-c6a4-47d1-aa42-ed381ba87b11"). InnerVolumeSpecName "kube-api-access-4pfvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.664343 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-scripts" (OuterVolumeSpecName: "scripts") pod "add8035b-c6a4-47d1-aa42-ed381ba87b11" (UID: "add8035b-c6a4-47d1-aa42-ed381ba87b11"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.675553 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d48f5c7d5-2xxzq" event={"ID":"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac","Type":"ContainerStarted","Data":"d3feacc3331804e1712f41fc23e6f1505c7e388c8e66bd2eac46928c7c297d67"} Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.681037 4911 generic.go:334] "Generic (PLEG): container finished" podID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerID="5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46" exitCode=0 Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.681100 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"add8035b-c6a4-47d1-aa42-ed381ba87b11","Type":"ContainerDied","Data":"5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46"} Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.681132 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"add8035b-c6a4-47d1-aa42-ed381ba87b11","Type":"ContainerDied","Data":"0a00b1ddede815fc87526cf7ea6fd44510c226b1ae58cfae4b10d42d81dfb6d8"} Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.681151 4911 scope.go:117] "RemoveContainer" containerID="b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.681270 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.685436 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-txf9k" event={"ID":"28a4dbd7-7594-4787-a138-952c605462c2","Type":"ContainerDied","Data":"c3c199f3a4a982567698cd9733a4cfc400eabc72755c38b7d88c69f34853efa6"} Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.685575 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-txf9k" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.699306 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8599db9-k9r6m" event={"ID":"c049ccee-c503-43b1-b263-c6ee453e93e0","Type":"ContainerStarted","Data":"7c94e95bdfa2384e5b77cc08ac2972e57990aa7dbedd3066da0af4fbd8180c38"} Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.706167 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f163cfe-1829-40bc-8117-4b2b1c072a91","Type":"ContainerStarted","Data":"90b8e611802de56be1692c1dae6b7ac2dbd66f75bfb60ee25e92e7102cf0c613"} Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.713846 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fb47b4698-gx22c" event={"ID":"2689d664-9bf8-4c5b-8c53-353286854071","Type":"ContainerStarted","Data":"311056b2b43d424fa57308e0f2c084c8b1ac0def84f00286bcfb1ac2802ada9f"} Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.713939 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.714004 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.733111 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-8599db9-k9r6m" podStartSLOduration=9.773119743 podStartE2EDuration="14.733085513s" podCreationTimestamp="2026-03-10 14:24:06 +0000 UTC" firstStartedPulling="2026-03-10 14:24:12.984986358 +0000 UTC m=+1357.548506295" lastFinishedPulling="2026-03-10 14:24:17.944952148 +0000 UTC m=+1362.508472065" observedRunningTime="2026-03-10 14:24:20.729864328 +0000 UTC m=+1365.293384265" watchObservedRunningTime="2026-03-10 14:24:20.733085513 +0000 UTC m=+1365.296605430" Mar 10 
14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.739035 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.739064 4911 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.739073 4911 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/add8035b-c6a4-47d1-aa42-ed381ba87b11-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.739082 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pfvx\" (UniqueName: \"kubernetes.io/projected/add8035b-c6a4-47d1-aa42-ed381ba87b11-kube-api-access-4pfvx\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.748137 4911 scope.go:117] "RemoveContainer" containerID="1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.753256 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "28a4dbd7-7594-4787-a138-952c605462c2" (UID: "28a4dbd7-7594-4787-a138-952c605462c2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.768463 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6fb47b4698-gx22c" podStartSLOduration=10.768431035999999 podStartE2EDuration="10.768431036s" podCreationTimestamp="2026-03-10 14:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:20.76171385 +0000 UTC m=+1365.325233777" watchObservedRunningTime="2026-03-10 14:24:20.768431036 +0000 UTC m=+1365.331950953" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.786191 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-config" (OuterVolumeSpecName: "config") pod "28a4dbd7-7594-4787-a138-952c605462c2" (UID: "28a4dbd7-7594-4787-a138-952c605462c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.787575 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "28a4dbd7-7594-4787-a138-952c605462c2" (UID: "28a4dbd7-7594-4787-a138-952c605462c2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.801948 4911 scope.go:117] "RemoveContainer" containerID="5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.824313 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "28a4dbd7-7594-4787-a138-952c605462c2" (UID: "28a4dbd7-7594-4787-a138-952c605462c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.842168 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.842261 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.842292 4911 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.842339 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.844167 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"28a4dbd7-7594-4787-a138-952c605462c2" (UID: "28a4dbd7-7594-4787-a138-952c605462c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.901312 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "add8035b-c6a4-47d1-aa42-ed381ba87b11" (UID: "add8035b-c6a4-47d1-aa42-ed381ba87b11"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.960282 4911 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:20 crc kubenswrapper[4911]: I0310 14:24:20.960322 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28a4dbd7-7594-4787-a138-952c605462c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.009940 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "add8035b-c6a4-47d1-aa42-ed381ba87b11" (UID: "add8035b-c6a4-47d1-aa42-ed381ba87b11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.068208 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.120297 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-config-data" (OuterVolumeSpecName: "config-data") pod "add8035b-c6a4-47d1-aa42-ed381ba87b11" (UID: "add8035b-c6a4-47d1-aa42-ed381ba87b11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.170083 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add8035b-c6a4-47d1-aa42-ed381ba87b11-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.247477 4911 scope.go:117] "RemoveContainer" containerID="6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615" Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.296080 4911 scope.go:117] "RemoveContainer" containerID="b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602" Mar 10 14:24:21 crc kubenswrapper[4911]: E0310 14:24:21.302932 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602\": container with ID starting with b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602 not found: ID does not exist" containerID="b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602" Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.303397 4911 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602"} err="failed to get container status \"b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602\": rpc error: code = NotFound desc = could not find container \"b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602\": container with ID starting with b2fd82bdb2234f876e135d235caface57ca2ed729c3f7335338c1accfa603602 not found: ID does not exist"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.303521 4911 scope.go:117] "RemoveContainer" containerID="1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176"
Mar 10 14:24:21 crc kubenswrapper[4911]: E0310 14:24:21.305023 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176\": container with ID starting with 1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176 not found: ID does not exist" containerID="1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.305130 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176"} err="failed to get container status \"1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176\": rpc error: code = NotFound desc = could not find container \"1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176\": container with ID starting with 1702ea919fd66ea3e6e466925b108e7d64aef647d8b9c9739f5e475f2e48f176 not found: ID does not exist"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.305218 4911 scope.go:117] "RemoveContainer" containerID="5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46"
Mar 10 14:24:21 crc kubenswrapper[4911]: E0310 14:24:21.307841 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46\": container with ID starting with 5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46 not found: ID does not exist" containerID="5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.307974 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46"} err="failed to get container status \"5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46\": rpc error: code = NotFound desc = could not find container \"5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46\": container with ID starting with 5fb7c972d5a845abba30ef2c793fcb831ddaae548fd1f972367e6bd466b6bd46 not found: ID does not exist"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.308071 4911 scope.go:117] "RemoveContainer" containerID="6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615"
Mar 10 14:24:21 crc kubenswrapper[4911]: E0310 14:24:21.322983 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615\": container with ID starting with 6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615 not found: ID does not exist" containerID="6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.323333 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615"} err="failed to get container status \"6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615\": rpc error: code = NotFound desc = could not find container \"6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615\": container with ID starting with 6f34303b2c0986655eac497b678cef1fc376801ab91d3eb9b38c44f5a9cff615 not found: ID does not exist"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.323450 4911 scope.go:117] "RemoveContainer" containerID="587fc87c892d2b7d922841d475d936859dda2e5001a59b6f888afc665b852e29"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.326856 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-txf9k"]
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.349046 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-txf9k"]
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.382184 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.396878 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.418243 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:24:21 crc kubenswrapper[4911]: E0310 14:24:21.418863 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="ceilometer-central-agent"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.418883 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="ceilometer-central-agent"
Mar 10 14:24:21 crc kubenswrapper[4911]: E0310 14:24:21.418902 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="sg-core"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.418910 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="sg-core"
Mar 10 14:24:21 crc kubenswrapper[4911]: E0310 14:24:21.418933 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="ceilometer-notification-agent"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.418939 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="ceilometer-notification-agent"
Mar 10 14:24:21 crc kubenswrapper[4911]: E0310 14:24:21.418951 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="proxy-httpd"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.418959 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="proxy-httpd"
Mar 10 14:24:21 crc kubenswrapper[4911]: E0310 14:24:21.418976 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a4dbd7-7594-4787-a138-952c605462c2" containerName="init"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.418984 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a4dbd7-7594-4787-a138-952c605462c2" containerName="init"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.419183 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="ceilometer-notification-agent"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.419201 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="sg-core"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.419216 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="proxy-httpd"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.419231 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a4dbd7-7594-4787-a138-952c605462c2" containerName="init"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.419242 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" containerName="ceilometer-central-agent"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.422494 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.422636 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.425925 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.427870 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.484478 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-run-httpd\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.484531 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-scripts\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.484621 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.484651 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-config-data\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.484701 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm78x\" (UniqueName: \"kubernetes.io/projected/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-kube-api-access-qm78x\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.484744 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.484792 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-log-httpd\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.589323 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.589386 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-config-data\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.589703 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm78x\" (UniqueName: \"kubernetes.io/projected/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-kube-api-access-qm78x\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.589821 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.589923 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-log-httpd\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.590079 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-run-httpd\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.590111 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-scripts\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.594117 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-log-httpd\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.594717 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-run-httpd\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.599513 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.606306 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.607064 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-scripts\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.622124 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-config-data\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.626607 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm78x\" (UniqueName: \"kubernetes.io/projected/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-kube-api-access-qm78x\") pod \"ceilometer-0\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") " pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.758228 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" event={"ID":"5322d280-17b4-489e-a867-6527ce33d3f4","Type":"ContainerStarted","Data":"5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de"}
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.758837 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.771064 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a9f0716-d281-468e-9ba9-d77699793836","Type":"ContainerStarted","Data":"0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765"}
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.781907 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f163cfe-1829-40bc-8117-4b2b1c072a91","Type":"ContainerStarted","Data":"12187f49867774c713e5a65a1ecb0edf9ac95261f07a7e155265df778b5a4df2"}
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.782104 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2f163cfe-1829-40bc-8117-4b2b1c072a91" containerName="cinder-api-log" containerID="cri-o://90b8e611802de56be1692c1dae6b7ac2dbd66f75bfb60ee25e92e7102cf0c613" gracePeriod=30
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.782393 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.782435 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2f163cfe-1829-40bc-8117-4b2b1c072a91" containerName="cinder-api" containerID="cri-o://12187f49867774c713e5a65a1ecb0edf9ac95261f07a7e155265df778b5a4df2" gracePeriod=30
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.783102 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.787122 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" podStartSLOduration=7.787109297 podStartE2EDuration="7.787109297s" podCreationTimestamp="2026-03-10 14:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:21.786614114 +0000 UTC m=+1366.350134041" watchObservedRunningTime="2026-03-10 14:24:21.787109297 +0000 UTC m=+1366.350629214"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.804193 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d48f5c7d5-2xxzq" event={"ID":"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac","Type":"ContainerStarted","Data":"cfecda72bbf0a6b55c159b64a058e7a1e93908eac3d94e8d1baf3707231ebedf"}
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.804266 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d48f5c7d5-2xxzq" event={"ID":"5be3e6b2-8478-41bf-9fb1-09e053e8b5ac","Type":"ContainerStarted","Data":"63d3cb927e6851e199c6fa504534bc0eee72f18a397a6b7a0c0c2d6831243874"}
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.805490 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d48f5c7d5-2xxzq"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.813791 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.813767404 podStartE2EDuration="7.813767404s" podCreationTimestamp="2026-03-10 14:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:21.81170968 +0000 UTC m=+1366.375229597" watchObservedRunningTime="2026-03-10 14:24:21.813767404 +0000 UTC m=+1366.377287341"
Mar 10 14:24:21 crc kubenswrapper[4911]: I0310 14:24:21.879921 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d48f5c7d5-2xxzq" podStartSLOduration=3.879893481 podStartE2EDuration="3.879893481s" podCreationTimestamp="2026-03-10 14:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:21.842174566 +0000 UTC m=+1366.405694503" watchObservedRunningTime="2026-03-10 14:24:21.879893481 +0000 UTC m=+1366.443413408"
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.033629 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-54d884b5d4-lsz26"
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.117910 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b4dd68964-gfvp8"]
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.118670 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b4dd68964-gfvp8" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon-log" containerID="cri-o://33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792" gracePeriod=30
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.118906 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b4dd68964-gfvp8" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon" containerID="cri-o://dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a" gracePeriod=30
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.159014 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b4dd68964-gfvp8" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": EOF"
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.240526 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28a4dbd7-7594-4787-a138-952c605462c2" path="/var/lib/kubelet/pods/28a4dbd7-7594-4787-a138-952c605462c2/volumes"
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.241445 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add8035b-c6a4-47d1-aa42-ed381ba87b11" path="/var/lib/kubelet/pods/add8035b-c6a4-47d1-aa42-ed381ba87b11/volumes"
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.848460 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a9f0716-d281-468e-9ba9-d77699793836","Type":"ContainerStarted","Data":"8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae"}
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.853996 4911 generic.go:334] "Generic (PLEG): container finished" podID="2f163cfe-1829-40bc-8117-4b2b1c072a91" containerID="12187f49867774c713e5a65a1ecb0edf9ac95261f07a7e155265df778b5a4df2" exitCode=0
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.854028 4911 generic.go:334] "Generic (PLEG): container finished" podID="2f163cfe-1829-40bc-8117-4b2b1c072a91" containerID="90b8e611802de56be1692c1dae6b7ac2dbd66f75bfb60ee25e92e7102cf0c613" exitCode=143
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.854116 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f163cfe-1829-40bc-8117-4b2b1c072a91","Type":"ContainerDied","Data":"12187f49867774c713e5a65a1ecb0edf9ac95261f07a7e155265df778b5a4df2"}
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.854152 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f163cfe-1829-40bc-8117-4b2b1c072a91","Type":"ContainerDied","Data":"90b8e611802de56be1692c1dae6b7ac2dbd66f75bfb60ee25e92e7102cf0c613"}
Mar 10 14:24:22 crc kubenswrapper[4911]: I0310 14:24:22.870603 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.444559549 podStartE2EDuration="8.87057707s" podCreationTimestamp="2026-03-10 14:24:14 +0000 UTC" firstStartedPulling="2026-03-10 14:24:17.695890252 +0000 UTC m=+1362.259410169" lastFinishedPulling="2026-03-10 14:24:19.121907773 +0000 UTC m=+1363.685427690" observedRunningTime="2026-03-10 14:24:22.866986396 +0000 UTC m=+1367.430506313" watchObservedRunningTime="2026-03-10 14:24:22.87057707 +0000 UTC m=+1367.434096977"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.392800 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.465610 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-combined-ca-bundle\") pod \"2f163cfe-1829-40bc-8117-4b2b1c072a91\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") "
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.465670 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f163cfe-1829-40bc-8117-4b2b1c072a91-logs\") pod \"2f163cfe-1829-40bc-8117-4b2b1c072a91\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") "
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.465775 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-scripts\") pod \"2f163cfe-1829-40bc-8117-4b2b1c072a91\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") "
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.465813 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data\") pod \"2f163cfe-1829-40bc-8117-4b2b1c072a91\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") "
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.465936 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66br7\" (UniqueName: \"kubernetes.io/projected/2f163cfe-1829-40bc-8117-4b2b1c072a91-kube-api-access-66br7\") pod \"2f163cfe-1829-40bc-8117-4b2b1c072a91\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") "
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.465967 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data-custom\") pod \"2f163cfe-1829-40bc-8117-4b2b1c072a91\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") "
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.465993 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f163cfe-1829-40bc-8117-4b2b1c072a91-etc-machine-id\") pod \"2f163cfe-1829-40bc-8117-4b2b1c072a91\" (UID: \"2f163cfe-1829-40bc-8117-4b2b1c072a91\") "
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.466453 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f163cfe-1829-40bc-8117-4b2b1c072a91-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2f163cfe-1829-40bc-8117-4b2b1c072a91" (UID: "2f163cfe-1829-40bc-8117-4b2b1c072a91"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.472671 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f163cfe-1829-40bc-8117-4b2b1c072a91-logs" (OuterVolumeSpecName: "logs") pod "2f163cfe-1829-40bc-8117-4b2b1c072a91" (UID: "2f163cfe-1829-40bc-8117-4b2b1c072a91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.488348 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f163cfe-1829-40bc-8117-4b2b1c072a91-kube-api-access-66br7" (OuterVolumeSpecName: "kube-api-access-66br7") pod "2f163cfe-1829-40bc-8117-4b2b1c072a91" (UID: "2f163cfe-1829-40bc-8117-4b2b1c072a91"). InnerVolumeSpecName "kube-api-access-66br7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.503964 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-scripts" (OuterVolumeSpecName: "scripts") pod "2f163cfe-1829-40bc-8117-4b2b1c072a91" (UID: "2f163cfe-1829-40bc-8117-4b2b1c072a91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.512476 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2f163cfe-1829-40bc-8117-4b2b1c072a91" (UID: "2f163cfe-1829-40bc-8117-4b2b1c072a91"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.553611 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f163cfe-1829-40bc-8117-4b2b1c072a91" (UID: "2f163cfe-1829-40bc-8117-4b2b1c072a91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.569045 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66br7\" (UniqueName: \"kubernetes.io/projected/2f163cfe-1829-40bc-8117-4b2b1c072a91-kube-api-access-66br7\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.569079 4911 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.569089 4911 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f163cfe-1829-40bc-8117-4b2b1c072a91-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.569100 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.569110 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f163cfe-1829-40bc-8117-4b2b1c072a91-logs\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.569128 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.587499 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data" (OuterVolumeSpecName: "config-data") pod "2f163cfe-1829-40bc-8117-4b2b1c072a91" (UID: "2f163cfe-1829-40bc-8117-4b2b1c072a91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.648613 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.671280 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f163cfe-1829-40bc-8117-4b2b1c072a91-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.869288 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f163cfe-1829-40bc-8117-4b2b1c072a91","Type":"ContainerDied","Data":"927bf3cc27c05a1937d0b82dc55ac910236cfadd20091a5b5ecedde445ce37a7"}
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.870415 4911 scope.go:117] "RemoveContainer" containerID="12187f49867774c713e5a65a1ecb0edf9ac95261f07a7e155265df778b5a4df2"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.869404 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.887507 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" event={"ID":"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6","Type":"ContainerStarted","Data":"4ddc1af45bf489251aec2cac77098e4f42970e984077a0a220878e58c9e0f617"}
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.887565 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" event={"ID":"d66c76f8-6b9a-40d3-b5fc-d2d5790928f6","Type":"ContainerStarted","Data":"f7f8c13fa933556cdd52ca79309ad0e69fdaad4de80539b4625f84878c1dcec1"}
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.891549 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526","Type":"ContainerStarted","Data":"bc8dee9ba930cb700639508b1a67c7c3ff11f7de49ba5186ff2ead05faff6b12"}
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.914317 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8f48b4f88-jjg7s" podStartSLOduration=11.750817181 podStartE2EDuration="16.914292976s" podCreationTimestamp="2026-03-10 14:24:07 +0000 UTC" firstStartedPulling="2026-03-10 14:24:17.785618016 +0000 UTC m=+1362.349137933" lastFinishedPulling="2026-03-10 14:24:22.949093811 +0000 UTC m=+1367.512613728" observedRunningTime="2026-03-10 14:24:23.909426719 +0000 UTC m=+1368.472946636" watchObservedRunningTime="2026-03-10 14:24:23.914292976 +0000 UTC m=+1368.477812893"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.922016 4911 scope.go:117] "RemoveContainer" containerID="90b8e611802de56be1692c1dae6b7ac2dbd66f75bfb60ee25e92e7102cf0c613"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.941602 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.956411 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.986184 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 10 14:24:23 crc kubenswrapper[4911]: E0310 14:24:23.986756 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f163cfe-1829-40bc-8117-4b2b1c072a91" containerName="cinder-api-log"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.986776 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f163cfe-1829-40bc-8117-4b2b1c072a91" containerName="cinder-api-log"
Mar 10 14:24:23 crc kubenswrapper[4911]: E0310 14:24:23.986809 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f163cfe-1829-40bc-8117-4b2b1c072a91" containerName="cinder-api"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.986816 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f163cfe-1829-40bc-8117-4b2b1c072a91" containerName="cinder-api"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.987027 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f163cfe-1829-40bc-8117-4b2b1c072a91" containerName="cinder-api-log"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.987061 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f163cfe-1829-40bc-8117-4b2b1c072a91" containerName="cinder-api"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.988694 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.993265 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.993470 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.993597 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 10 14:24:23 crc kubenswrapper[4911]: I0310 14:24:23.997929 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.079246 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0"
Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.079668 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptnv\" (UniqueName: \"kubernetes.io/projected/6e666af1-a2f4-4aa0-95c6-f8568be705d8-kube-api-access-7ptnv\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0"
Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.082967 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-config-data-custom\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0"
Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.083062 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0"
Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.088846 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0"
Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.089222 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e666af1-a2f4-4aa0-95c6-f8568be705d8-logs\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0"
Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.089299 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-config-data\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0"
Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.089411 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e666af1-a2f4-4aa0-95c6-f8568be705d8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0"
Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.089502 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-scripts\") pod \"cinder-api-0\"
(UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.192154 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptnv\" (UniqueName: \"kubernetes.io/projected/6e666af1-a2f4-4aa0-95c6-f8568be705d8-kube-api-access-7ptnv\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.192599 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-config-data-custom\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.192623 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.192662 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.192711 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e666af1-a2f4-4aa0-95c6-f8568be705d8-logs\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.192748 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-config-data\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.192776 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e666af1-a2f4-4aa0-95c6-f8568be705d8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.192802 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-scripts\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.192831 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.193303 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e666af1-a2f4-4aa0-95c6-f8568be705d8-logs\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.193382 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e666af1-a2f4-4aa0-95c6-f8568be705d8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: 
I0310 14:24:24.199472 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.200457 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.204711 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-config-data-custom\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.206128 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.208425 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f163cfe-1829-40bc-8117-4b2b1c072a91" path="/var/lib/kubelet/pods/2f163cfe-1829-40bc-8117-4b2b1c072a91/volumes" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.209306 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-scripts\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 
14:24:24.210056 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e666af1-a2f4-4aa0-95c6-f8568be705d8-config-data\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.219087 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptnv\" (UniqueName: \"kubernetes.io/projected/6e666af1-a2f4-4aa0-95c6-f8568be705d8-kube-api-access-7ptnv\") pod \"cinder-api-0\" (UID: \"6e666af1-a2f4-4aa0-95c6-f8568be705d8\") " pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.342780 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.815528 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.903086 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526","Type":"ContainerStarted","Data":"866faac80a64e67dacdefb7c8776027cabd266ad71e5c3c0a327a4e9279af85e"} Mar 10 14:24:24 crc kubenswrapper[4911]: I0310 14:24:24.903781 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 14:24:24 crc kubenswrapper[4911]: W0310 14:24:24.910776 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e666af1_a2f4_4aa0_95c6_f8568be705d8.slice/crio-7b87abd3f1d490a3b7392f41a197190c5b9da7e50ad0be152b25163422d35fad WatchSource:0}: Error finding container 7b87abd3f1d490a3b7392f41a197190c5b9da7e50ad0be152b25163422d35fad: Status 404 returned error can't find the container with id 7b87abd3f1d490a3b7392f41a197190c5b9da7e50ad0be152b25163422d35fad Mar 10 
14:24:25 crc kubenswrapper[4911]: I0310 14:24:25.930126 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6e666af1-a2f4-4aa0-95c6-f8568be705d8","Type":"ContainerStarted","Data":"3220236c08ff92b018773213d34ba587434ebd072162136f6adc84d9f0c8010f"} Mar 10 14:24:25 crc kubenswrapper[4911]: I0310 14:24:25.930483 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6e666af1-a2f4-4aa0-95c6-f8568be705d8","Type":"ContainerStarted","Data":"7b87abd3f1d490a3b7392f41a197190c5b9da7e50ad0be152b25163422d35fad"} Mar 10 14:24:26 crc kubenswrapper[4911]: I0310 14:24:26.812937 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b4dd68964-gfvp8" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:50554->10.217.0.153:8443: read: connection reset by peer" Mar 10 14:24:26 crc kubenswrapper[4911]: I0310 14:24:26.815153 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b4dd68964-gfvp8" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 10 14:24:26 crc kubenswrapper[4911]: I0310 14:24:26.970597 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526","Type":"ContainerStarted","Data":"48fd44ee7dc5ab3ad38b7dcd6bb8303be8e6722aaacdabe481b5163f90450c29"} Mar 10 14:24:26 crc kubenswrapper[4911]: I0310 14:24:26.994152 4911 generic.go:334] "Generic (PLEG): container finished" podID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerID="dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a" exitCode=0 Mar 10 14:24:26 crc kubenswrapper[4911]: I0310 
14:24:26.994244 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b4dd68964-gfvp8" event={"ID":"a546f2b5-3536-4608-b1f4-0127ebd52bfa","Type":"ContainerDied","Data":"dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a"} Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.766532 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.884254 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-public-tls-certs\") pod \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.884355 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-internal-tls-certs\") pod \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.884424 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-combined-ca-bundle\") pod \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.884503 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-httpd-config\") pod \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.884584 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-v9rpg\" (UniqueName: \"kubernetes.io/projected/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-kube-api-access-v9rpg\") pod \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.884622 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-ovndb-tls-certs\") pod \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.884703 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-config\") pod \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\" (UID: \"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303\") " Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.891875 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-kube-api-access-v9rpg" (OuterVolumeSpecName: "kube-api-access-v9rpg") pod "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" (UID: "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303"). InnerVolumeSpecName "kube-api-access-v9rpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.907430 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" (UID: "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.977321 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" (UID: "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.986583 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-config" (OuterVolumeSpecName: "config") pod "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" (UID: "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.988168 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9rpg\" (UniqueName: \"kubernetes.io/projected/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-kube-api-access-v9rpg\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.988202 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.988214 4911 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:27 crc kubenswrapper[4911]: I0310 14:24:27.988225 4911 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:28 crc 
kubenswrapper[4911]: I0310 14:24:28.005735 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" (UID: "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.011824 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526","Type":"ContainerStarted","Data":"c0b1da9b0147c06e30ea28bc2aff6419b9f0935e314f93b995d03d7ef0ae6e25"} Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.018652 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6e666af1-a2f4-4aa0-95c6-f8568be705d8","Type":"ContainerStarted","Data":"6a24b436ea84a9cbe7dcda531581ed42f777f33730a6d27710a1dd9c3add8f38"} Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.018814 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.021932 4911 generic.go:334] "Generic (PLEG): container finished" podID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerID="4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122" exitCode=0 Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.021972 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78dcb5b94f-bjgh8" event={"ID":"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303","Type":"ContainerDied","Data":"4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122"} Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.022003 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78dcb5b94f-bjgh8" 
event={"ID":"6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303","Type":"ContainerDied","Data":"b91eb1af87fc3b56e6bcef353c9517fb08927f743e4a72a9b79e8eb6f672f0b0"} Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.022024 4911 scope.go:117] "RemoveContainer" containerID="94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.022199 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78dcb5b94f-bjgh8" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.036153 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.036949 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" (UID: "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.062186 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.062160341 podStartE2EDuration="5.062160341s" podCreationTimestamp="2026-03-10 14:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:28.041593074 +0000 UTC m=+1372.605113001" watchObservedRunningTime="2026-03-10 14:24:28.062160341 +0000 UTC m=+1372.625680258" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.076322 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" (UID: "6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.090210 4911 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.090242 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.090252 4911 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.177077 4911 scope.go:117] "RemoveContainer" 
containerID="4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.214861 4911 scope.go:117] "RemoveContainer" containerID="94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea" Mar 10 14:24:28 crc kubenswrapper[4911]: E0310 14:24:28.215934 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea\": container with ID starting with 94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea not found: ID does not exist" containerID="94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.215980 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea"} err="failed to get container status \"94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea\": rpc error: code = NotFound desc = could not find container \"94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea\": container with ID starting with 94a6473eb3a91715e2483b91d4336b4fcebd9676bf0750276cded5b8e9f900ea not found: ID does not exist" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.216007 4911 scope.go:117] "RemoveContainer" containerID="4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122" Mar 10 14:24:28 crc kubenswrapper[4911]: E0310 14:24:28.216475 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122\": container with ID starting with 4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122 not found: ID does not exist" containerID="4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122" Mar 10 14:24:28 crc 
kubenswrapper[4911]: I0310 14:24:28.216516 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122"} err="failed to get container status \"4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122\": rpc error: code = NotFound desc = could not find container \"4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122\": container with ID starting with 4bed1e4516c7d4b5bac9fdb1f4cfd8f4eec757ffc856108e4fb330aaa9e42122 not found: ID does not exist" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.352025 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78dcb5b94f-bjgh8"] Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.373199 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-78dcb5b94f-bjgh8"] Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.578757 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fb47b4698-gx22c" Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.665105 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-554f4c9c94-4jmq8"] Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.665688 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-554f4c9c94-4jmq8" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api-log" containerID="cri-o://3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1" gracePeriod=30 Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.666276 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-554f4c9c94-4jmq8" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api" containerID="cri-o://a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840" gracePeriod=30 Mar 10 14:24:28 crc 
kubenswrapper[4911]: I0310 14:24:28.672361 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-554f4c9c94-4jmq8" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": EOF"
Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.673032 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-554f4c9c94-4jmq8" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": EOF"
Mar 10 14:24:28 crc kubenswrapper[4911]: I0310 14:24:28.677978 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-554f4c9c94-4jmq8" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": EOF"
Mar 10 14:24:29 crc kubenswrapper[4911]: I0310 14:24:29.043809 4911 generic.go:334] "Generic (PLEG): container finished" podID="aa83f400-7839-4e47-a578-73f2cad08034" containerID="3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1" exitCode=143
Mar 10 14:24:29 crc kubenswrapper[4911]: I0310 14:24:29.043901 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554f4c9c94-4jmq8" event={"ID":"aa83f400-7839-4e47-a578-73f2cad08034","Type":"ContainerDied","Data":"3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1"}
Mar 10 14:24:29 crc kubenswrapper[4911]: I0310 14:24:29.932917 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh"
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.051600 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-xgm4j"]
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.051979 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" podUID="7a0a0bdc-e60b-4a14-b244-49619bde6bd6" containerName="dnsmasq-dns" containerID="cri-o://bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901" gracePeriod=10
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.096119 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526","Type":"ContainerStarted","Data":"ebaf072307ac2a9de2ac29ff4920cf0b86782a1efe71c408f5474c0ba69180ce"}
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.096979 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.125891 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.332695824 podStartE2EDuration="9.125872271s" podCreationTimestamp="2026-03-10 14:24:21 +0000 UTC" firstStartedPulling="2026-03-10 14:24:23.662426986 +0000 UTC m=+1368.225946903" lastFinishedPulling="2026-03-10 14:24:29.455603433 +0000 UTC m=+1374.019123350" observedRunningTime="2026-03-10 14:24:30.125293876 +0000 UTC m=+1374.688813783" watchObservedRunningTime="2026-03-10 14:24:30.125872271 +0000 UTC m=+1374.689392178"
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.209863 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" path="/var/lib/kubelet/pods/6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303/volumes"
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.709356 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.722087 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j"
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.829597 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.878546 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-swift-storage-0\") pod \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") "
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.878606 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-nb\") pod \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") "
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.878662 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-svc\") pod \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") "
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.878712 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-sb\") pod \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") "
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.878796 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz57k\" (UniqueName: \"kubernetes.io/projected/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-kube-api-access-bz57k\") pod \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") "
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.878863 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-config\") pod \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\" (UID: \"7a0a0bdc-e60b-4a14-b244-49619bde6bd6\") "
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.888350 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-kube-api-access-bz57k" (OuterVolumeSpecName: "kube-api-access-bz57k") pod "7a0a0bdc-e60b-4a14-b244-49619bde6bd6" (UID: "7a0a0bdc-e60b-4a14-b244-49619bde6bd6"). InnerVolumeSpecName "kube-api-access-bz57k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.968577 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-config" (OuterVolumeSpecName: "config") pod "7a0a0bdc-e60b-4a14-b244-49619bde6bd6" (UID: "7a0a0bdc-e60b-4a14-b244-49619bde6bd6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.979258 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a0a0bdc-e60b-4a14-b244-49619bde6bd6" (UID: "7a0a0bdc-e60b-4a14-b244-49619bde6bd6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.980895 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-config\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.980932 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.980947 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz57k\" (UniqueName: \"kubernetes.io/projected/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-kube-api-access-bz57k\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:30 crc kubenswrapper[4911]: I0310 14:24:30.998370 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a0a0bdc-e60b-4a14-b244-49619bde6bd6" (UID: "7a0a0bdc-e60b-4a14-b244-49619bde6bd6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.014328 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a0a0bdc-e60b-4a14-b244-49619bde6bd6" (UID: "7a0a0bdc-e60b-4a14-b244-49619bde6bd6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.018932 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a0a0bdc-e60b-4a14-b244-49619bde6bd6" (UID: "7a0a0bdc-e60b-4a14-b244-49619bde6bd6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.083581 4911 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.083626 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.083637 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a0a0bdc-e60b-4a14-b244-49619bde6bd6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.148319 4911 generic.go:334] "Generic (PLEG): container finished" podID="7a0a0bdc-e60b-4a14-b244-49619bde6bd6" containerID="bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901" exitCode=0
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.148460 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" event={"ID":"7a0a0bdc-e60b-4a14-b244-49619bde6bd6","Type":"ContainerDied","Data":"bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901"}
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.148524 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j" event={"ID":"7a0a0bdc-e60b-4a14-b244-49619bde6bd6","Type":"ContainerDied","Data":"1ea24f2c2e21cead37b8d9fb0a88dac205d7adcc0522147994e2d4d97fd46278"}
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.148549 4911 scope.go:117] "RemoveContainer" containerID="bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901"
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.148955 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-xgm4j"
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.149205 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8a9f0716-d281-468e-9ba9-d77699793836" containerName="cinder-scheduler" containerID="cri-o://0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765" gracePeriod=30
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.149388 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8a9f0716-d281-468e-9ba9-d77699793836" containerName="probe" containerID="cri-o://8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae" gracePeriod=30
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.214658 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-xgm4j"]
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.231651 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-xgm4j"]
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.276621 4911 scope.go:117] "RemoveContainer" containerID="d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1"
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.317603 4911 scope.go:117] "RemoveContainer" containerID="bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901"
Mar 10 14:24:31 crc kubenswrapper[4911]: E0310 14:24:31.320696 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901\": container with ID starting with bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901 not found: ID does not exist" containerID="bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901"
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.320770 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901"} err="failed to get container status \"bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901\": rpc error: code = NotFound desc = could not find container \"bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901\": container with ID starting with bc3023faa4952faaaeaaf1144eed992b180e99de471fd04eeb8c62c28f6ba901 not found: ID does not exist"
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.320806 4911 scope.go:117] "RemoveContainer" containerID="d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1"
Mar 10 14:24:31 crc kubenswrapper[4911]: E0310 14:24:31.322040 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1\": container with ID starting with d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1 not found: ID does not exist" containerID="d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1"
Mar 10 14:24:31 crc kubenswrapper[4911]: I0310 14:24:31.322071 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1"} err="failed to get container status \"d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1\": rpc error: code = NotFound desc = could not find container \"d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1\": container with ID starting with d56b3f6b078d63779141993be1c35ad0d89d954ef0c0efe77c513afed0a8f6e1 not found: ID does not exist"
Mar 10 14:24:32 crc kubenswrapper[4911]: I0310 14:24:32.168832 4911 generic.go:334] "Generic (PLEG): container finished" podID="8a9f0716-d281-468e-9ba9-d77699793836" containerID="8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae" exitCode=0
Mar 10 14:24:32 crc kubenswrapper[4911]: I0310 14:24:32.169339 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a9f0716-d281-468e-9ba9-d77699793836","Type":"ContainerDied","Data":"8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae"}
Mar 10 14:24:32 crc kubenswrapper[4911]: I0310 14:24:32.186018 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-554f4c9c94-4jmq8"
Mar 10 14:24:32 crc kubenswrapper[4911]: I0310 14:24:32.210292 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0a0bdc-e60b-4a14-b244-49619bde6bd6" path="/var/lib/kubelet/pods/7a0a0bdc-e60b-4a14-b244-49619bde6bd6/volumes"
Mar 10 14:24:32 crc kubenswrapper[4911]: I0310 14:24:32.478449 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-76d846bbc6-4wr5p"
Mar 10 14:24:33 crc kubenswrapper[4911]: I0310 14:24:33.716963 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-554f4c9c94-4jmq8" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 14:24:34 crc kubenswrapper[4911]: I0310 14:24:34.877914 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.026352 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a9f0716-d281-468e-9ba9-d77699793836-etc-machine-id\") pod \"8a9f0716-d281-468e-9ba9-d77699793836\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") "
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.026527 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a9f0716-d281-468e-9ba9-d77699793836-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8a9f0716-d281-468e-9ba9-d77699793836" (UID: "8a9f0716-d281-468e-9ba9-d77699793836"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.026611 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data-custom\") pod \"8a9f0716-d281-468e-9ba9-d77699793836\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") "
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.026642 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data\") pod \"8a9f0716-d281-468e-9ba9-d77699793836\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") "
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.026750 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-combined-ca-bundle\") pod \"8a9f0716-d281-468e-9ba9-d77699793836\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") "
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.026904 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r4q9\" (UniqueName: \"kubernetes.io/projected/8a9f0716-d281-468e-9ba9-d77699793836-kube-api-access-5r4q9\") pod \"8a9f0716-d281-468e-9ba9-d77699793836\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") "
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.026946 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-scripts\") pod \"8a9f0716-d281-468e-9ba9-d77699793836\" (UID: \"8a9f0716-d281-468e-9ba9-d77699793836\") "
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.027461 4911 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a9f0716-d281-468e-9ba9-d77699793836-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.033936 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9f0716-d281-468e-9ba9-d77699793836-kube-api-access-5r4q9" (OuterVolumeSpecName: "kube-api-access-5r4q9") pod "8a9f0716-d281-468e-9ba9-d77699793836" (UID: "8a9f0716-d281-468e-9ba9-d77699793836"). InnerVolumeSpecName "kube-api-access-5r4q9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.033946 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8a9f0716-d281-468e-9ba9-d77699793836" (UID: "8a9f0716-d281-468e-9ba9-d77699793836"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.049080 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-scripts" (OuterVolumeSpecName: "scripts") pod "8a9f0716-d281-468e-9ba9-d77699793836" (UID: "8a9f0716-d281-468e-9ba9-d77699793836"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.090824 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a9f0716-d281-468e-9ba9-d77699793836" (UID: "8a9f0716-d281-468e-9ba9-d77699793836"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.129609 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.129662 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r4q9\" (UniqueName: \"kubernetes.io/projected/8a9f0716-d281-468e-9ba9-d77699793836-kube-api-access-5r4q9\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.129679 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.129694 4911 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.182759 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data" (OuterVolumeSpecName: "config-data") pod "8a9f0716-d281-468e-9ba9-d77699793836" (UID: "8a9f0716-d281-468e-9ba9-d77699793836"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.226368 4911 generic.go:334] "Generic (PLEG): container finished" podID="8a9f0716-d281-468e-9ba9-d77699793836" containerID="0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765" exitCode=0
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.226485 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a9f0716-d281-468e-9ba9-d77699793836","Type":"ContainerDied","Data":"0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765"}
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.226560 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a9f0716-d281-468e-9ba9-d77699793836","Type":"ContainerDied","Data":"fd222f4273d70d5deb157861b202b50db69ada7835a061c31cf93a947443b273"}
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.226586 4911 scope.go:117] "RemoveContainer" containerID="8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.226806 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.231211 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9f0716-d281-468e-9ba9-d77699793836-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.267999 4911 scope.go:117] "RemoveContainer" containerID="0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.283333 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.292439 4911 scope.go:117] "RemoveContainer" containerID="8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae"
Mar 10 14:24:35 crc kubenswrapper[4911]: E0310 14:24:35.295221 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae\": container with ID starting with 8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae not found: ID does not exist" containerID="8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.295275 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae"} err="failed to get container status \"8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae\": rpc error: code = NotFound desc = could not find container \"8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae\": container with ID starting with 8dde3386a563a6edd716b3270844f993160fd8254b011b028f9fea8be41331ae not found: ID does not exist"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.295310 4911 scope.go:117] "RemoveContainer" containerID="0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765"
Mar 10 14:24:35 crc kubenswrapper[4911]: E0310 14:24:35.295952 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765\": container with ID starting with 0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765 not found: ID does not exist" containerID="0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.295985 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765"} err="failed to get container status \"0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765\": rpc error: code = NotFound desc = could not find container \"0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765\": container with ID starting with 0cf2bbc1e64926a113d159ae6df9157011d2119e03784eddc420f0a6a80d5765 not found: ID does not exist"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.298330 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.308834 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 14:24:35 crc kubenswrapper[4911]: E0310 14:24:35.309403 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0a0bdc-e60b-4a14-b244-49619bde6bd6" containerName="init"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309429 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0a0bdc-e60b-4a14-b244-49619bde6bd6" containerName="init"
Mar 10 14:24:35 crc kubenswrapper[4911]: E0310 14:24:35.309450 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerName="neutron-httpd"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309460 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerName="neutron-httpd"
Mar 10 14:24:35 crc kubenswrapper[4911]: E0310 14:24:35.309477 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerName="neutron-api"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309489 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerName="neutron-api"
Mar 10 14:24:35 crc kubenswrapper[4911]: E0310 14:24:35.309517 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0a0bdc-e60b-4a14-b244-49619bde6bd6" containerName="dnsmasq-dns"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309526 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0a0bdc-e60b-4a14-b244-49619bde6bd6" containerName="dnsmasq-dns"
Mar 10 14:24:35 crc kubenswrapper[4911]: E0310 14:24:35.309539 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9f0716-d281-468e-9ba9-d77699793836" containerName="cinder-scheduler"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309547 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9f0716-d281-468e-9ba9-d77699793836" containerName="cinder-scheduler"
Mar 10 14:24:35 crc kubenswrapper[4911]: E0310 14:24:35.309566 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9f0716-d281-468e-9ba9-d77699793836" containerName="probe"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309574 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9f0716-d281-468e-9ba9-d77699793836" containerName="probe"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309856 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9f0716-d281-468e-9ba9-d77699793836" containerName="cinder-scheduler"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309910 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0a0bdc-e60b-4a14-b244-49619bde6bd6" containerName="dnsmasq-dns"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309925 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerName="neutron-api"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309938 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="6add0c8c-ea9f-4fc7-87ad-7c58e8b4a303" containerName="neutron-httpd"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.309953 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9f0716-d281-468e-9ba9-d77699793836" containerName="probe"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.311276 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.314548 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.327556 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.436885 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-config-data\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.437017 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-scripts\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.437055 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.437154 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9z6z\" (UniqueName: \"kubernetes.io/projected/b634ed72-d485-42b9-a382-24974c25ab42-kube-api-access-p9z6z\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.437206 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.437269 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b634ed72-d485-42b9-a382-24974c25ab42-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.538954 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-config-data\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.539000 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-scripts\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.539031 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.539101 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9z6z\" (UniqueName: \"kubernetes.io/projected/b634ed72-d485-42b9-a382-24974c25ab42-kube-api-access-p9z6z\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.539134 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.539179 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b634ed72-d485-42b9-a382-24974c25ab42-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.539256 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b634ed72-d485-42b9-a382-24974c25ab42-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.544288 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.548782 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-config-data\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.554395 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.558843 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b634ed72-d485-42b9-a382-24974c25ab42-scripts\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.566582 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9z6z\" (UniqueName: \"kubernetes.io/projected/b634ed72-d485-42b9-a382-24974c25ab42-kube-api-access-p9z6z\") pod \"cinder-scheduler-0\" (UID: \"b634ed72-d485-42b9-a382-24974c25ab42\") " pod="openstack/cinder-scheduler-0"
Mar 
10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.636201 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.815795 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.818072 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.825872 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.826296 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.827004 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6r74m" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.828873 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.854930 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b4dd68964-gfvp8" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.863477 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjxv9\" (UniqueName: \"kubernetes.io/projected/350c17be-173a-480f-bb79-314043291d4d-kube-api-access-sjxv9\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 
14:24:35.863602 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c17be-173a-480f-bb79-314043291d4d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.863694 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/350c17be-173a-480f-bb79-314043291d4d-openstack-config\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.863776 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/350c17be-173a-480f-bb79-314043291d4d-openstack-config-secret\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.967347 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjxv9\" (UniqueName: \"kubernetes.io/projected/350c17be-173a-480f-bb79-314043291d4d-kube-api-access-sjxv9\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.967424 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c17be-173a-480f-bb79-314043291d4d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.967489 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/350c17be-173a-480f-bb79-314043291d4d-openstack-config\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.967532 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/350c17be-173a-480f-bb79-314043291d4d-openstack-config-secret\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.977542 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c17be-173a-480f-bb79-314043291d4d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.978150 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/350c17be-173a-480f-bb79-314043291d4d-openstack-config-secret\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:35 crc kubenswrapper[4911]: I0310 14:24:35.978783 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/350c17be-173a-480f-bb79-314043291d4d-openstack-config\") pod \"openstackclient\" (UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:36 crc kubenswrapper[4911]: I0310 14:24:36.016453 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjxv9\" (UniqueName: \"kubernetes.io/projected/350c17be-173a-480f-bb79-314043291d4d-kube-api-access-sjxv9\") pod \"openstackclient\" 
(UID: \"350c17be-173a-480f-bb79-314043291d4d\") " pod="openstack/openstackclient" Mar 10 14:24:36 crc kubenswrapper[4911]: I0310 14:24:36.206142 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6r74m" Mar 10 14:24:36 crc kubenswrapper[4911]: I0310 14:24:36.206513 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 14:24:36 crc kubenswrapper[4911]: I0310 14:24:36.224222 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9f0716-d281-468e-9ba9-d77699793836" path="/var/lib/kubelet/pods/8a9f0716-d281-468e-9ba9-d77699793836/volumes" Mar 10 14:24:36 crc kubenswrapper[4911]: I0310 14:24:36.298289 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-554f4c9c94-4jmq8" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:57712->10.217.0.167:9311: read: connection reset by peer" Mar 10 14:24:36 crc kubenswrapper[4911]: I0310 14:24:36.299051 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-554f4c9c94-4jmq8" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:34624->10.217.0.167:9311: read: connection reset by peer" Mar 10 14:24:36 crc kubenswrapper[4911]: I0310 14:24:36.387260 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 14:24:36 crc kubenswrapper[4911]: I0310 14:24:36.844892 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.048239 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.117775 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa83f400-7839-4e47-a578-73f2cad08034-logs\") pod \"aa83f400-7839-4e47-a578-73f2cad08034\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.118253 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-combined-ca-bundle\") pod \"aa83f400-7839-4e47-a578-73f2cad08034\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.118383 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data-custom\") pod \"aa83f400-7839-4e47-a578-73f2cad08034\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.118476 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcptr\" (UniqueName: \"kubernetes.io/projected/aa83f400-7839-4e47-a578-73f2cad08034-kube-api-access-dcptr\") pod \"aa83f400-7839-4e47-a578-73f2cad08034\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.118536 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data\") pod \"aa83f400-7839-4e47-a578-73f2cad08034\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.118542 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/aa83f400-7839-4e47-a578-73f2cad08034-logs" (OuterVolumeSpecName: "logs") pod "aa83f400-7839-4e47-a578-73f2cad08034" (UID: "aa83f400-7839-4e47-a578-73f2cad08034"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.119086 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa83f400-7839-4e47-a578-73f2cad08034-logs\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.120548 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.126616 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aa83f400-7839-4e47-a578-73f2cad08034" (UID: "aa83f400-7839-4e47-a578-73f2cad08034"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.128969 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa83f400-7839-4e47-a578-73f2cad08034-kube-api-access-dcptr" (OuterVolumeSpecName: "kube-api-access-dcptr") pod "aa83f400-7839-4e47-a578-73f2cad08034" (UID: "aa83f400-7839-4e47-a578-73f2cad08034"). InnerVolumeSpecName "kube-api-access-dcptr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.178013 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa83f400-7839-4e47-a578-73f2cad08034" (UID: "aa83f400-7839-4e47-a578-73f2cad08034"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.208610 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-577d67f998-s8wh9" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.220448 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data" (OuterVolumeSpecName: "config-data") pod "aa83f400-7839-4e47-a578-73f2cad08034" (UID: "aa83f400-7839-4e47-a578-73f2cad08034"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.220912 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data\") pod \"aa83f400-7839-4e47-a578-73f2cad08034\" (UID: \"aa83f400-7839-4e47-a578-73f2cad08034\") " Mar 10 14:24:37 crc kubenswrapper[4911]: W0310 14:24:37.221971 4911 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/aa83f400-7839-4e47-a578-73f2cad08034/volumes/kubernetes.io~secret/config-data Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.222010 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data" (OuterVolumeSpecName: "config-data") pod "aa83f400-7839-4e47-a578-73f2cad08034" (UID: "aa83f400-7839-4e47-a578-73f2cad08034"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.224193 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.224226 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.224237 4911 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa83f400-7839-4e47-a578-73f2cad08034-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.224247 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcptr\" (UniqueName: \"kubernetes.io/projected/aa83f400-7839-4e47-a578-73f2cad08034-kube-api-access-dcptr\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.276570 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"350c17be-173a-480f-bb79-314043291d4d","Type":"ContainerStarted","Data":"6ceeefd7bb28cd8c4d2d8ce028acaecab96674e920168ef12cea40f1bbf8c366"} Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.278756 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b634ed72-d485-42b9-a382-24974c25ab42","Type":"ContainerStarted","Data":"2e24569586035d04ea0fde2a2ba19181f45f9d64713906884ca58163d263fc23"} Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.292027 4911 generic.go:334] "Generic (PLEG): container finished" podID="aa83f400-7839-4e47-a578-73f2cad08034" containerID="a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840" 
exitCode=0 Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.292169 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-554f4c9c94-4jmq8" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.292153 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554f4c9c94-4jmq8" event={"ID":"aa83f400-7839-4e47-a578-73f2cad08034","Type":"ContainerDied","Data":"a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840"} Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.292269 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554f4c9c94-4jmq8" event={"ID":"aa83f400-7839-4e47-a578-73f2cad08034","Type":"ContainerDied","Data":"7c3a05044f3d6005ed67943bd424c03919bf6131f5857992b68d360b1a2611d0"} Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.292313 4911 scope.go:117] "RemoveContainer" containerID="a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.359970 4911 scope.go:117] "RemoveContainer" containerID="3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.370466 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-554f4c9c94-4jmq8"] Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.393003 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-554f4c9c94-4jmq8"] Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.410484 4911 scope.go:117] "RemoveContainer" containerID="a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840" Mar 10 14:24:37 crc kubenswrapper[4911]: E0310 14:24:37.412181 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840\": container with ID starting with 
a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840 not found: ID does not exist" containerID="a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.412220 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840"} err="failed to get container status \"a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840\": rpc error: code = NotFound desc = could not find container \"a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840\": container with ID starting with a883736b3d2194c567482b40bdc9ca43c5b10b6750a8bb2f93626da651dc2840 not found: ID does not exist" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.412243 4911 scope.go:117] "RemoveContainer" containerID="3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1" Mar 10 14:24:37 crc kubenswrapper[4911]: E0310 14:24:37.415932 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1\": container with ID starting with 3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1 not found: ID does not exist" containerID="3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.416098 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1"} err="failed to get container status \"3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1\": rpc error: code = NotFound desc = could not find container \"3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1\": container with ID starting with 3fe6d127ae935955af7b0703fb50ba080218d1c4324360deef3afa4504401ba1 not found: ID does not 
exist" Mar 10 14:24:37 crc kubenswrapper[4911]: I0310 14:24:37.462064 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 14:24:38 crc kubenswrapper[4911]: I0310 14:24:38.207202 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa83f400-7839-4e47-a578-73f2cad08034" path="/var/lib/kubelet/pods/aa83f400-7839-4e47-a578-73f2cad08034/volumes" Mar 10 14:24:38 crc kubenswrapper[4911]: I0310 14:24:38.309043 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b634ed72-d485-42b9-a382-24974c25ab42","Type":"ContainerStarted","Data":"ae2dbc2bb6b43732f4dab1474ee82d6a35b3358ab2aa01d98e63557aec29ee68"} Mar 10 14:24:38 crc kubenswrapper[4911]: I0310 14:24:38.309124 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b634ed72-d485-42b9-a382-24974c25ab42","Type":"ContainerStarted","Data":"19f5be8be1afa946f08986172b57ccbfb999147d82900335c7922b6b53bbdbb7"} Mar 10 14:24:38 crc kubenswrapper[4911]: I0310 14:24:38.334605 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.334578509 podStartE2EDuration="3.334578509s" podCreationTimestamp="2026-03-10 14:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:38.329328631 +0000 UTC m=+1382.892848558" watchObservedRunningTime="2026-03-10 14:24:38.334578509 +0000 UTC m=+1382.898098426" Mar 10 14:24:40 crc kubenswrapper[4911]: I0310 14:24:40.329348 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:24:40 crc kubenswrapper[4911]: I0310 14:24:40.330240 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="ceilometer-central-agent" 
containerID="cri-o://866faac80a64e67dacdefb7c8776027cabd266ad71e5c3c0a327a4e9279af85e" gracePeriod=30 Mar 10 14:24:40 crc kubenswrapper[4911]: I0310 14:24:40.330974 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="proxy-httpd" containerID="cri-o://ebaf072307ac2a9de2ac29ff4920cf0b86782a1efe71c408f5474c0ba69180ce" gracePeriod=30 Mar 10 14:24:40 crc kubenswrapper[4911]: I0310 14:24:40.331060 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="sg-core" containerID="cri-o://c0b1da9b0147c06e30ea28bc2aff6419b9f0935e314f93b995d03d7ef0ae6e25" gracePeriod=30 Mar 10 14:24:40 crc kubenswrapper[4911]: I0310 14:24:40.331121 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="ceilometer-notification-agent" containerID="cri-o://48fd44ee7dc5ab3ad38b7dcd6bb8303be8e6722aaacdabe481b5163f90450c29" gracePeriod=30 Mar 10 14:24:40 crc kubenswrapper[4911]: I0310 14:24:40.342369 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 14:24:40 crc kubenswrapper[4911]: I0310 14:24:40.636856 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.358358 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-69f9f96d6c-plmfc"] Mar 10 14:24:41 crc kubenswrapper[4911]: E0310 14:24:41.360970 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api" Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.360997 4911 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api" Mar 10 14:24:41 crc kubenswrapper[4911]: E0310 14:24:41.361033 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api-log" Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.361039 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api-log" Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.361253 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api" Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.361282 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa83f400-7839-4e47-a578-73f2cad08034" containerName="barbican-api-log" Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.362343 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-69f9f96d6c-plmfc" Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.372254 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.372601 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.372766 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.415811 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-69f9f96d6c-plmfc"] Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.421147 4911 generic.go:334] "Generic (PLEG): container finished" podID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerID="ebaf072307ac2a9de2ac29ff4920cf0b86782a1efe71c408f5474c0ba69180ce" exitCode=0 Mar 10 14:24:41 
crc kubenswrapper[4911]: I0310 14:24:41.421215 4911 generic.go:334] "Generic (PLEG): container finished" podID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerID="c0b1da9b0147c06e30ea28bc2aff6419b9f0935e314f93b995d03d7ef0ae6e25" exitCode=2 Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.421225 4911 generic.go:334] "Generic (PLEG): container finished" podID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerID="48fd44ee7dc5ab3ad38b7dcd6bb8303be8e6722aaacdabe481b5163f90450c29" exitCode=0 Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.421235 4911 generic.go:334] "Generic (PLEG): container finished" podID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerID="866faac80a64e67dacdefb7c8776027cabd266ad71e5c3c0a327a4e9279af85e" exitCode=0 Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.421273 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526","Type":"ContainerDied","Data":"ebaf072307ac2a9de2ac29ff4920cf0b86782a1efe71c408f5474c0ba69180ce"} Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.421352 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526","Type":"ContainerDied","Data":"c0b1da9b0147c06e30ea28bc2aff6419b9f0935e314f93b995d03d7ef0ae6e25"} Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.421368 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526","Type":"ContainerDied","Data":"48fd44ee7dc5ab3ad38b7dcd6bb8303be8e6722aaacdabe481b5163f90450c29"} Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.421379 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526","Type":"ContainerDied","Data":"866faac80a64e67dacdefb7c8776027cabd266ad71e5c3c0a327a4e9279af85e"} Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.445253 
4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8flgp\" (UniqueName: \"kubernetes.io/projected/7f8852b3-f34b-4a37-b546-b7bd6b595203-kube-api-access-8flgp\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.445322 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-internal-tls-certs\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.445342 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-combined-ca-bundle\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.445366 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-config-data\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.445396 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f8852b3-f34b-4a37-b546-b7bd6b595203-log-httpd\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.445423 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f8852b3-f34b-4a37-b546-b7bd6b595203-run-httpd\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.445461 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f8852b3-f34b-4a37-b546-b7bd6b595203-etc-swift\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.445508 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-public-tls-certs\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.457922 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.546444 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-scripts\") pod \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") "
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547093 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-combined-ca-bundle\") pod \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") "
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547181 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-sg-core-conf-yaml\") pod \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") "
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547277 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm78x\" (UniqueName: \"kubernetes.io/projected/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-kube-api-access-qm78x\") pod \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") "
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547305 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-run-httpd\") pod \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") "
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547410 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-log-httpd\") pod \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") "
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547447 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-config-data\") pod \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\" (UID: \"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526\") "
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547836 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8flgp\" (UniqueName: \"kubernetes.io/projected/7f8852b3-f34b-4a37-b546-b7bd6b595203-kube-api-access-8flgp\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547886 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-combined-ca-bundle\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547902 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-internal-tls-certs\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547924 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-config-data\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547949 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f8852b3-f34b-4a37-b546-b7bd6b595203-log-httpd\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.547976 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f8852b3-f34b-4a37-b546-b7bd6b595203-run-httpd\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.548015 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f8852b3-f34b-4a37-b546-b7bd6b595203-etc-swift\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.548059 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-public-tls-certs\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.550884 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" (UID: "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.553082 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" (UID: "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.554809 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f8852b3-f34b-4a37-b546-b7bd6b595203-run-httpd\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.556432 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f8852b3-f34b-4a37-b546-b7bd6b595203-log-httpd\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.568778 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-internal-tls-certs\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.569863 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f8852b3-f34b-4a37-b546-b7bd6b595203-etc-swift\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.574317 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-config-data\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.577944 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-scripts" (OuterVolumeSpecName: "scripts") pod "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" (UID: "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.590642 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-kube-api-access-qm78x" (OuterVolumeSpecName: "kube-api-access-qm78x") pod "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" (UID: "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526"). InnerVolumeSpecName "kube-api-access-qm78x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.592919 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8flgp\" (UniqueName: \"kubernetes.io/projected/7f8852b3-f34b-4a37-b546-b7bd6b595203-kube-api-access-8flgp\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.595257 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-public-tls-certs\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.598941 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8852b3-f34b-4a37-b546-b7bd6b595203-combined-ca-bundle\") pod \"swift-proxy-69f9f96d6c-plmfc\" (UID: \"7f8852b3-f34b-4a37-b546-b7bd6b595203\") " pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.619988 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" (UID: "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.651749 4911 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.651788 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.651799 4911 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.651814 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm78x\" (UniqueName: \"kubernetes.io/projected/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-kube-api-access-qm78x\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.651824 4911 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.715081 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" (UID: "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.734807 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.748829 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-config-data" (OuterVolumeSpecName: "config-data") pod "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" (UID: "c4c38e4e-e3cd-48d6-bbc4-3014a10f7526"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.754462 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:41 crc kubenswrapper[4911]: I0310 14:24:41.754509 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.345463 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-69f9f96d6c-plmfc"]
Mar 10 14:24:42 crc kubenswrapper[4911]: W0310 14:24:42.359676 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8852b3_f34b_4a37_b546_b7bd6b595203.slice/crio-c2694f48541e6bdb6e748b3d46e961c9ccf98284fed2bf3f0369080d7daff929 WatchSource:0}: Error finding container c2694f48541e6bdb6e748b3d46e961c9ccf98284fed2bf3f0369080d7daff929: Status 404 returned error can't find the container with id c2694f48541e6bdb6e748b3d46e961c9ccf98284fed2bf3f0369080d7daff929
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.439790 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69f9f96d6c-plmfc" event={"ID":"7f8852b3-f34b-4a37-b546-b7bd6b595203","Type":"ContainerStarted","Data":"c2694f48541e6bdb6e748b3d46e961c9ccf98284fed2bf3f0369080d7daff929"}
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.450687 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4c38e4e-e3cd-48d6-bbc4-3014a10f7526","Type":"ContainerDied","Data":"bc8dee9ba930cb700639508b1a67c7c3ff11f7de49ba5186ff2ead05faff6b12"}
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.450778 4911 scope.go:117] "RemoveContainer" containerID="ebaf072307ac2a9de2ac29ff4920cf0b86782a1efe71c408f5474c0ba69180ce"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.450843 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.488044 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.495760 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.529802 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:24:42 crc kubenswrapper[4911]: E0310 14:24:42.531224 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="proxy-httpd"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.531333 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="proxy-httpd"
Mar 10 14:24:42 crc kubenswrapper[4911]: E0310 14:24:42.531414 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="ceilometer-notification-agent"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.531481 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="ceilometer-notification-agent"
Mar 10 14:24:42 crc kubenswrapper[4911]: E0310 14:24:42.531620 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="ceilometer-central-agent"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.531678 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="ceilometer-central-agent"
Mar 10 14:24:42 crc kubenswrapper[4911]: E0310 14:24:42.531784 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="sg-core"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.531839 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="sg-core"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.532131 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="ceilometer-central-agent"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.532712 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="ceilometer-notification-agent"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.532805 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="sg-core"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.532865 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" containerName="proxy-httpd"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.536306 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.539532 4911 scope.go:117] "RemoveContainer" containerID="c0b1da9b0147c06e30ea28bc2aff6419b9f0935e314f93b995d03d7ef0ae6e25"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.540802 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.541273 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.543805 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.626226 4911 scope.go:117] "RemoveContainer" containerID="48fd44ee7dc5ab3ad38b7dcd6bb8303be8e6722aaacdabe481b5163f90450c29"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.662522 4911 scope.go:117] "RemoveContainer" containerID="866faac80a64e67dacdefb7c8776027cabd266ad71e5c3c0a327a4e9279af85e"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.682707 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-config-data\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.682801 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-log-httpd\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.682868 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.682893 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-run-httpd\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.682910 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.682933 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp5w2\" (UniqueName: \"kubernetes.io/projected/16236160-b958-4733-a1f7-b3bfe8aeac93-kube-api-access-lp5w2\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.682989 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-scripts\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.784531 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.784589 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-run-httpd\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.784609 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.784629 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp5w2\" (UniqueName: \"kubernetes.io/projected/16236160-b958-4733-a1f7-b3bfe8aeac93-kube-api-access-lp5w2\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.784695 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-scripts\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.784761 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-config-data\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.784806 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-log-httpd\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.785324 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-run-httpd\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.785347 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-log-httpd\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.794402 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.796161 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-config-data\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.798288 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-scripts\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.798476 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.803361 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp5w2\" (UniqueName: \"kubernetes.io/projected/16236160-b958-4733-a1f7-b3bfe8aeac93-kube-api-access-lp5w2\") pod \"ceilometer-0\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") " pod="openstack/ceilometer-0"
Mar 10 14:24:42 crc kubenswrapper[4911]: I0310 14:24:42.931914 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:24:43 crc kubenswrapper[4911]: I0310 14:24:43.475435 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69f9f96d6c-plmfc" event={"ID":"7f8852b3-f34b-4a37-b546-b7bd6b595203","Type":"ContainerStarted","Data":"bf25c6e630bbaee502657b4e37e46b2513f9e56b9fd5ad15d14345c014bddde7"}
Mar 10 14:24:43 crc kubenswrapper[4911]: I0310 14:24:43.475504 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69f9f96d6c-plmfc" event={"ID":"7f8852b3-f34b-4a37-b546-b7bd6b595203","Type":"ContainerStarted","Data":"90553390b46292a69defcf955ab471be7c0b4a1a1b9fb2082a68f887889da471"}
Mar 10 14:24:43 crc kubenswrapper[4911]: I0310 14:24:43.475571 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:43 crc kubenswrapper[4911]: I0310 14:24:43.500861 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-69f9f96d6c-plmfc" podStartSLOduration=2.500826048 podStartE2EDuration="2.500826048s" podCreationTimestamp="2026-03-10 14:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:43.498047465 +0000 UTC m=+1388.061567382" watchObservedRunningTime="2026-03-10 14:24:43.500826048 +0000 UTC m=+1388.064345965"
Mar 10 14:24:44 crc kubenswrapper[4911]: I0310 14:24:44.204528 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4c38e4e-e3cd-48d6-bbc4-3014a10f7526" path="/var/lib/kubelet/pods/c4c38e4e-e3cd-48d6-bbc4-3014a10f7526/volumes"
Mar 10 14:24:44 crc kubenswrapper[4911]: I0310 14:24:44.543223 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-69f9f96d6c-plmfc"
Mar 10 14:24:45 crc kubenswrapper[4911]: I0310 14:24:45.861962 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b4dd68964-gfvp8" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused"
Mar 10 14:24:45 crc kubenswrapper[4911]: I0310 14:24:45.914446 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 10 14:24:46 crc kubenswrapper[4911]: I0310 14:24:46.437710 4911 scope.go:117] "RemoveContainer" containerID="b901f388fd87ff94d84fda609875a64624b2019071ed736af354521df410dab5"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.223885 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-wdsg2"]
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.225321 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wdsg2"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.226710 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wdsg2"]
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.311008 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt5m6\" (UniqueName: \"kubernetes.io/projected/41e63a1e-e526-498d-b14f-0720657e6c30-kube-api-access-vt5m6\") pod \"nova-api-db-create-wdsg2\" (UID: \"41e63a1e-e526-498d-b14f-0720657e6c30\") " pod="openstack/nova-api-db-create-wdsg2"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.311211 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e63a1e-e526-498d-b14f-0720657e6c30-operator-scripts\") pod \"nova-api-db-create-wdsg2\" (UID: \"41e63a1e-e526-498d-b14f-0720657e6c30\") " pod="openstack/nova-api-db-create-wdsg2"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.351917 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-lwpcs"]
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.355108 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lwpcs"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.403000 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d929-account-create-update-z2j9n"]
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.419449 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d929-account-create-update-z2j9n"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.422304 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-operator-scripts\") pod \"nova-cell0-db-create-lwpcs\" (UID: \"34e0ca1a-a12e-45bd-8adc-3852606ec8a9\") " pod="openstack/nova-cell0-db-create-lwpcs"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.422374 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e63a1e-e526-498d-b14f-0720657e6c30-operator-scripts\") pod \"nova-api-db-create-wdsg2\" (UID: \"41e63a1e-e526-498d-b14f-0720657e6c30\") " pod="openstack/nova-api-db-create-wdsg2"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.422478 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb5sj\" (UniqueName: \"kubernetes.io/projected/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-kube-api-access-zb5sj\") pod \"nova-cell0-db-create-lwpcs\" (UID: \"34e0ca1a-a12e-45bd-8adc-3852606ec8a9\") " pod="openstack/nova-cell0-db-create-lwpcs"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.422562 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt5m6\" (UniqueName: \"kubernetes.io/projected/41e63a1e-e526-498d-b14f-0720657e6c30-kube-api-access-vt5m6\") pod \"nova-api-db-create-wdsg2\" (UID: \"41e63a1e-e526-498d-b14f-0720657e6c30\") " pod="openstack/nova-api-db-create-wdsg2"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.423350 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.423610 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e63a1e-e526-498d-b14f-0720657e6c30-operator-scripts\") pod \"nova-api-db-create-wdsg2\" (UID: \"41e63a1e-e526-498d-b14f-0720657e6c30\") " pod="openstack/nova-api-db-create-wdsg2"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.432635 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lwpcs"]
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.449026 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d929-account-create-update-z2j9n"]
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.468666 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt5m6\" (UniqueName: \"kubernetes.io/projected/41e63a1e-e526-498d-b14f-0720657e6c30-kube-api-access-vt5m6\") pod \"nova-api-db-create-wdsg2\" (UID: \"41e63a1e-e526-498d-b14f-0720657e6c30\") " pod="openstack/nova-api-db-create-wdsg2"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.517466 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-m9ftl"]
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.519245 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m9ftl"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.535273 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-operator-scripts\") pod \"nova-cell0-db-create-lwpcs\" (UID: \"34e0ca1a-a12e-45bd-8adc-3852606ec8a9\") " pod="openstack/nova-cell0-db-create-lwpcs"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.535375 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb5sj\" (UniqueName: \"kubernetes.io/projected/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-kube-api-access-zb5sj\") pod \"nova-cell0-db-create-lwpcs\" (UID: \"34e0ca1a-a12e-45bd-8adc-3852606ec8a9\") " pod="openstack/nova-cell0-db-create-lwpcs"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.535507 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhgfq\" (UniqueName: \"kubernetes.io/projected/20d998cd-6fc6-498b-9a65-18dbf5931cc3-kube-api-access-nhgfq\") pod \"nova-api-d929-account-create-update-z2j9n\" (UID: \"20d998cd-6fc6-498b-9a65-18dbf5931cc3\") " pod="openstack/nova-api-d929-account-create-update-z2j9n"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.535561 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20d998cd-6fc6-498b-9a65-18dbf5931cc3-operator-scripts\") pod \"nova-api-d929-account-create-update-z2j9n\" (UID: \"20d998cd-6fc6-498b-9a65-18dbf5931cc3\") " pod="openstack/nova-api-d929-account-create-update-z2j9n"
Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.536625 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-operator-scripts\") pod
\"nova-cell0-db-create-lwpcs\" (UID: \"34e0ca1a-a12e-45bd-8adc-3852606ec8a9\") " pod="openstack/nova-cell0-db-create-lwpcs" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.545892 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5c94-account-create-update-9z2xg"] Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.547214 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.549537 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wdsg2" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.551274 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.556221 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-m9ftl"] Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.571605 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5c94-account-create-update-9z2xg"] Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.589582 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb5sj\" (UniqueName: \"kubernetes.io/projected/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-kube-api-access-zb5sj\") pod \"nova-cell0-db-create-lwpcs\" (UID: \"34e0ca1a-a12e-45bd-8adc-3852606ec8a9\") " pod="openstack/nova-cell0-db-create-lwpcs" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.637904 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhgfq\" (UniqueName: \"kubernetes.io/projected/20d998cd-6fc6-498b-9a65-18dbf5931cc3-kube-api-access-nhgfq\") pod \"nova-api-d929-account-create-update-z2j9n\" (UID: \"20d998cd-6fc6-498b-9a65-18dbf5931cc3\") " 
pod="openstack/nova-api-d929-account-create-update-z2j9n" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.637994 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2zr5\" (UniqueName: \"kubernetes.io/projected/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-kube-api-access-p2zr5\") pod \"nova-cell0-5c94-account-create-update-9z2xg\" (UID: \"2e53f4fb-9bff-46a8-a447-e19ee4777c3b\") " pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.638042 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b4b396-5ac9-490b-a59e-b568ca3cb638-operator-scripts\") pod \"nova-cell1-db-create-m9ftl\" (UID: \"f9b4b396-5ac9-490b-a59e-b568ca3cb638\") " pod="openstack/nova-cell1-db-create-m9ftl" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.638077 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20d998cd-6fc6-498b-9a65-18dbf5931cc3-operator-scripts\") pod \"nova-api-d929-account-create-update-z2j9n\" (UID: \"20d998cd-6fc6-498b-9a65-18dbf5931cc3\") " pod="openstack/nova-api-d929-account-create-update-z2j9n" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.638327 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsnhb\" (UniqueName: \"kubernetes.io/projected/f9b4b396-5ac9-490b-a59e-b568ca3cb638-kube-api-access-nsnhb\") pod \"nova-cell1-db-create-m9ftl\" (UID: \"f9b4b396-5ac9-490b-a59e-b568ca3cb638\") " pod="openstack/nova-cell1-db-create-m9ftl" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.638381 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-operator-scripts\") pod \"nova-cell0-5c94-account-create-update-9z2xg\" (UID: \"2e53f4fb-9bff-46a8-a447-e19ee4777c3b\") " pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.638913 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20d998cd-6fc6-498b-9a65-18dbf5931cc3-operator-scripts\") pod \"nova-api-d929-account-create-update-z2j9n\" (UID: \"20d998cd-6fc6-498b-9a65-18dbf5931cc3\") " pod="openstack/nova-api-d929-account-create-update-z2j9n" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.662250 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhgfq\" (UniqueName: \"kubernetes.io/projected/20d998cd-6fc6-498b-9a65-18dbf5931cc3-kube-api-access-nhgfq\") pod \"nova-api-d929-account-create-update-z2j9n\" (UID: \"20d998cd-6fc6-498b-9a65-18dbf5931cc3\") " pod="openstack/nova-api-d929-account-create-update-z2j9n" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.707672 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lwpcs" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.723127 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c623-account-create-update-pzwxt"] Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.724468 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c623-account-create-update-pzwxt" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.726564 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.739333 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c623-account-create-update-pzwxt"] Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.740631 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2zr5\" (UniqueName: \"kubernetes.io/projected/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-kube-api-access-p2zr5\") pod \"nova-cell0-5c94-account-create-update-9z2xg\" (UID: \"2e53f4fb-9bff-46a8-a447-e19ee4777c3b\") " pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.740678 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b4b396-5ac9-490b-a59e-b568ca3cb638-operator-scripts\") pod \"nova-cell1-db-create-m9ftl\" (UID: \"f9b4b396-5ac9-490b-a59e-b568ca3cb638\") " pod="openstack/nova-cell1-db-create-m9ftl" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.740775 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsnhb\" (UniqueName: \"kubernetes.io/projected/f9b4b396-5ac9-490b-a59e-b568ca3cb638-kube-api-access-nsnhb\") pod \"nova-cell1-db-create-m9ftl\" (UID: \"f9b4b396-5ac9-490b-a59e-b568ca3cb638\") " pod="openstack/nova-cell1-db-create-m9ftl" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.740811 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-operator-scripts\") pod \"nova-cell0-5c94-account-create-update-9z2xg\" (UID: 
\"2e53f4fb-9bff-46a8-a447-e19ee4777c3b\") " pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.741572 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-operator-scripts\") pod \"nova-cell0-5c94-account-create-update-9z2xg\" (UID: \"2e53f4fb-9bff-46a8-a447-e19ee4777c3b\") " pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.742717 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b4b396-5ac9-490b-a59e-b568ca3cb638-operator-scripts\") pod \"nova-cell1-db-create-m9ftl\" (UID: \"f9b4b396-5ac9-490b-a59e-b568ca3cb638\") " pod="openstack/nova-cell1-db-create-m9ftl" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.746674 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d929-account-create-update-z2j9n" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.781007 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsnhb\" (UniqueName: \"kubernetes.io/projected/f9b4b396-5ac9-490b-a59e-b568ca3cb638-kube-api-access-nsnhb\") pod \"nova-cell1-db-create-m9ftl\" (UID: \"f9b4b396-5ac9-490b-a59e-b568ca3cb638\") " pod="openstack/nova-cell1-db-create-m9ftl" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.796425 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2zr5\" (UniqueName: \"kubernetes.io/projected/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-kube-api-access-p2zr5\") pod \"nova-cell0-5c94-account-create-update-9z2xg\" (UID: \"2e53f4fb-9bff-46a8-a447-e19ee4777c3b\") " pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.842407 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0b02600-8a57-4944-9f93-cdf74f7dad84-operator-scripts\") pod \"nova-cell1-c623-account-create-update-pzwxt\" (UID: \"e0b02600-8a57-4944-9f93-cdf74f7dad84\") " pod="openstack/nova-cell1-c623-account-create-update-pzwxt" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.843090 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbvfq\" (UniqueName: \"kubernetes.io/projected/e0b02600-8a57-4944-9f93-cdf74f7dad84-kube-api-access-jbvfq\") pod \"nova-cell1-c623-account-create-update-pzwxt\" (UID: \"e0b02600-8a57-4944-9f93-cdf74f7dad84\") " pod="openstack/nova-cell1-c623-account-create-update-pzwxt" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.871064 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-m9ftl" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.926676 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.946390 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0b02600-8a57-4944-9f93-cdf74f7dad84-operator-scripts\") pod \"nova-cell1-c623-account-create-update-pzwxt\" (UID: \"e0b02600-8a57-4944-9f93-cdf74f7dad84\") " pod="openstack/nova-cell1-c623-account-create-update-pzwxt" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.946745 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbvfq\" (UniqueName: \"kubernetes.io/projected/e0b02600-8a57-4944-9f93-cdf74f7dad84-kube-api-access-jbvfq\") pod \"nova-cell1-c623-account-create-update-pzwxt\" (UID: \"e0b02600-8a57-4944-9f93-cdf74f7dad84\") " pod="openstack/nova-cell1-c623-account-create-update-pzwxt" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.947647 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0b02600-8a57-4944-9f93-cdf74f7dad84-operator-scripts\") pod \"nova-cell1-c623-account-create-update-pzwxt\" (UID: \"e0b02600-8a57-4944-9f93-cdf74f7dad84\") " pod="openstack/nova-cell1-c623-account-create-update-pzwxt" Mar 10 14:24:47 crc kubenswrapper[4911]: I0310 14:24:47.965011 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbvfq\" (UniqueName: \"kubernetes.io/projected/e0b02600-8a57-4944-9f93-cdf74f7dad84-kube-api-access-jbvfq\") pod \"nova-cell1-c623-account-create-update-pzwxt\" (UID: \"e0b02600-8a57-4944-9f93-cdf74f7dad84\") " pod="openstack/nova-cell1-c623-account-create-update-pzwxt" Mar 10 14:24:48 crc 
kubenswrapper[4911]: I0310 14:24:48.047486 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c623-account-create-update-pzwxt" Mar 10 14:24:48 crc kubenswrapper[4911]: I0310 14:24:48.520778 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:24:48 crc kubenswrapper[4911]: I0310 14:24:48.520896 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:24:48 crc kubenswrapper[4911]: I0310 14:24:48.895454 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d48f5c7d5-2xxzq" Mar 10 14:24:48 crc kubenswrapper[4911]: I0310 14:24:48.980690 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cf9dfcf44-sls8z"] Mar 10 14:24:48 crc kubenswrapper[4911]: I0310 14:24:48.981019 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cf9dfcf44-sls8z" podUID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" containerName="neutron-api" containerID="cri-o://95be14fb3901116d3d2f49bbf6cd4ceb984aa453ad533d64f331c09118428d98" gracePeriod=30 Mar 10 14:24:48 crc kubenswrapper[4911]: I0310 14:24:48.981586 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cf9dfcf44-sls8z" podUID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" containerName="neutron-httpd" containerID="cri-o://401715d98a839e18293dc3f8c3e947d7a73c27c3071b88d923203d15ef2cca0b" gracePeriod=30 Mar 10 14:24:49 crc 
kubenswrapper[4911]: I0310 14:24:49.084693 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:24:49 crc kubenswrapper[4911]: I0310 14:24:49.085472 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" containerName="glance-log" containerID="cri-o://09a287c0af9ac56378919e06cbb2868faa3a49db885c95daf22c889f9f356e32" gracePeriod=30 Mar 10 14:24:49 crc kubenswrapper[4911]: I0310 14:24:49.085869 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" containerName="glance-httpd" containerID="cri-o://d0a2307df180200bd46740c366eaa709060005f76fa90802cb4fb9cd917ba67b" gracePeriod=30 Mar 10 14:24:49 crc kubenswrapper[4911]: I0310 14:24:49.407769 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:24:49 crc kubenswrapper[4911]: I0310 14:24:49.646418 4911 generic.go:334] "Generic (PLEG): container finished" podID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" containerID="09a287c0af9ac56378919e06cbb2868faa3a49db885c95daf22c889f9f356e32" exitCode=143 Mar 10 14:24:49 crc kubenswrapper[4911]: I0310 14:24:49.646558 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b58550bb-f4f6-40ef-9424-7bfd05d57c9d","Type":"ContainerDied","Data":"09a287c0af9ac56378919e06cbb2868faa3a49db885c95daf22c889f9f356e32"} Mar 10 14:24:49 crc kubenswrapper[4911]: I0310 14:24:49.662917 4911 generic.go:334] "Generic (PLEG): container finished" podID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" containerID="401715d98a839e18293dc3f8c3e947d7a73c27c3071b88d923203d15ef2cca0b" exitCode=0 Mar 10 14:24:49 crc kubenswrapper[4911]: I0310 14:24:49.662975 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf9dfcf44-sls8z" 
event={"ID":"4b96f0e8-39e3-4018-bb9e-75b83d333bd4","Type":"ContainerDied","Data":"401715d98a839e18293dc3f8c3e947d7a73c27c3071b88d923203d15ef2cca0b"} Mar 10 14:24:50 crc kubenswrapper[4911]: I0310 14:24:50.003598 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:24:50 crc kubenswrapper[4911]: I0310 14:24:50.004006 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" containerName="glance-log" containerID="cri-o://5a23957fa981e06933df72f1bb48cabd3f01029020258904eed5bd94e0c03126" gracePeriod=30 Mar 10 14:24:50 crc kubenswrapper[4911]: I0310 14:24:50.004157 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" containerName="glance-httpd" containerID="cri-o://dd9670e57c5cef170cc3de9608ba90e388a6380241c3fe1a1378fdc2169ceb0b" gracePeriod=30 Mar 10 14:24:50 crc kubenswrapper[4911]: I0310 14:24:50.675066 4911 generic.go:334] "Generic (PLEG): container finished" podID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" containerID="5a23957fa981e06933df72f1bb48cabd3f01029020258904eed5bd94e0c03126" exitCode=143 Mar 10 14:24:50 crc kubenswrapper[4911]: I0310 14:24:50.675123 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4428f0a9-5fdc-4bdb-931e-da67fb9498c9","Type":"ContainerDied","Data":"5a23957fa981e06933df72f1bb48cabd3f01029020258904eed5bd94e0c03126"} Mar 10 14:24:51 crc kubenswrapper[4911]: I0310 14:24:51.471962 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:24:51 crc kubenswrapper[4911]: W0310 14:24:51.538849 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16236160_b958_4733_a1f7_b3bfe8aeac93.slice/crio-24f67465cd59d7b6929045b85ae26632cae0cb19ca51aa294c4d2eefd1c4030e WatchSource:0}: Error finding container 24f67465cd59d7b6929045b85ae26632cae0cb19ca51aa294c4d2eefd1c4030e: Status 404 returned error can't find the container with id 24f67465cd59d7b6929045b85ae26632cae0cb19ca51aa294c4d2eefd1c4030e Mar 10 14:24:51 crc kubenswrapper[4911]: I0310 14:24:51.730914 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-m9ftl"] Mar 10 14:24:51 crc kubenswrapper[4911]: I0310 14:24:51.731129 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16236160-b958-4733-a1f7-b3bfe8aeac93","Type":"ContainerStarted","Data":"24f67465cd59d7b6929045b85ae26632cae0cb19ca51aa294c4d2eefd1c4030e"} Mar 10 14:24:51 crc kubenswrapper[4911]: I0310 14:24:51.749179 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-69f9f96d6c-plmfc" Mar 10 14:24:51 crc kubenswrapper[4911]: I0310 14:24:51.753947 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5c94-account-create-update-9z2xg"] Mar 10 14:24:51 crc kubenswrapper[4911]: I0310 14:24:51.759458 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-69f9f96d6c-plmfc" Mar 10 14:24:51 crc kubenswrapper[4911]: I0310 14:24:51.793691 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wdsg2"] Mar 10 14:24:51 crc kubenswrapper[4911]: I0310 14:24:51.981216 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lwpcs"] Mar 10 14:24:52 crc kubenswrapper[4911]: W0310 14:24:52.001014 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34e0ca1a_a12e_45bd_8adc_3852606ec8a9.slice/crio-d95578050253791c9183fbff0165f22bb005d81339c8e40793fca4955290f49a WatchSource:0}: Error finding container d95578050253791c9183fbff0165f22bb005d81339c8e40793fca4955290f49a: Status 404 returned error can't find the container with id d95578050253791c9183fbff0165f22bb005d81339c8e40793fca4955290f49a Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.293473 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d929-account-create-update-z2j9n"] Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.294165 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c623-account-create-update-pzwxt"] Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.656604 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b4dd68964-gfvp8" Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.769534 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c623-account-create-update-pzwxt" event={"ID":"e0b02600-8a57-4944-9f93-cdf74f7dad84","Type":"ContainerStarted","Data":"051e07b5e5787eba2f639031817bf7c502ed899b72406e551a99d22edbd6ecc6"} Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.769597 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c623-account-create-update-pzwxt" event={"ID":"e0b02600-8a57-4944-9f93-cdf74f7dad84","Type":"ContainerStarted","Data":"4d51b2afa1d65ef6a953cdfba65f6c7eb7f4287af888283e1c62ba894f165fe6"} Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.784027 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lwpcs" event={"ID":"34e0ca1a-a12e-45bd-8adc-3852606ec8a9","Type":"ContainerStarted","Data":"bcbc688113f8cb2ee00b0d735b3165ebf42dac0e7b4cddd1e0c84076a0695e3b"} Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.784078 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lwpcs" event={"ID":"34e0ca1a-a12e-45bd-8adc-3852606ec8a9","Type":"ContainerStarted","Data":"d95578050253791c9183fbff0165f22bb005d81339c8e40793fca4955290f49a"} Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.814056 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-secret-key\") pod \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.814220 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-config-data\") pod \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.814268 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-tls-certs\") pod \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.814301 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlzl6\" (UniqueName: \"kubernetes.io/projected/a546f2b5-3536-4608-b1f4-0127ebd52bfa-kube-api-access-zlzl6\") pod \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.814335 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-combined-ca-bundle\") pod \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\" (UID: 
\"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.814872 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a546f2b5-3536-4608-b1f4-0127ebd52bfa-logs\") pod \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.814914 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-scripts\") pod \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\" (UID: \"a546f2b5-3536-4608-b1f4-0127ebd52bfa\") " Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.821225 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16236160-b958-4733-a1f7-b3bfe8aeac93","Type":"ContainerStarted","Data":"9a029a5175990d2a360f99ceee30bf164bee47c919135c6a59d41b29117e09d8"} Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.823500 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a546f2b5-3536-4608-b1f4-0127ebd52bfa-logs" (OuterVolumeSpecName: "logs") pod "a546f2b5-3536-4608-b1f4-0127ebd52bfa" (UID: "a546f2b5-3536-4608-b1f4-0127ebd52bfa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.839295 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-c623-account-create-update-pzwxt" podStartSLOduration=5.839271467 podStartE2EDuration="5.839271467s" podCreationTimestamp="2026-03-10 14:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:52.812873468 +0000 UTC m=+1397.376393405" watchObservedRunningTime="2026-03-10 14:24:52.839271467 +0000 UTC m=+1397.402791384"
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.842036 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wdsg2" event={"ID":"41e63a1e-e526-498d-b14f-0720657e6c30","Type":"ContainerStarted","Data":"0f625dededc2b14e4a81dee42d61239a44f8452622e2682478cf067aa3a2b125"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.842084 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wdsg2" event={"ID":"41e63a1e-e526-498d-b14f-0720657e6c30","Type":"ContainerStarted","Data":"b28bb771feb60b8011a64ef12cebd387f14af156c11e52e5f3f4c29aa340fe66"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.849286 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a546f2b5-3536-4608-b1f4-0127ebd52bfa" (UID: "a546f2b5-3536-4608-b1f4-0127ebd52bfa"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.851365 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"350c17be-173a-480f-bb79-314043291d4d","Type":"ContainerStarted","Data":"f3e9b3950a01613414c5012d35674e11992895b7bcd2cc2d658571703a0a20d2"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.880790 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a546f2b5-3536-4608-b1f4-0127ebd52bfa-kube-api-access-zlzl6" (OuterVolumeSpecName: "kube-api-access-zlzl6") pod "a546f2b5-3536-4608-b1f4-0127ebd52bfa" (UID: "a546f2b5-3536-4608-b1f4-0127ebd52bfa"). InnerVolumeSpecName "kube-api-access-zlzl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.886204 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d929-account-create-update-z2j9n" event={"ID":"20d998cd-6fc6-498b-9a65-18dbf5931cc3","Type":"ContainerStarted","Data":"4a8a84684bed8d4e40666e945d53ca5a61b3dfc8972b3eb66b59e700a01915f0"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.886274 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d929-account-create-update-z2j9n" event={"ID":"20d998cd-6fc6-498b-9a65-18dbf5931cc3","Type":"ContainerStarted","Data":"008c08817339ac7a977833d987d475be659ccd58fff480c660ef153d0c53976e"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.921156 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-config-data" (OuterVolumeSpecName: "config-data") pod "a546f2b5-3536-4608-b1f4-0127ebd52bfa" (UID: "a546f2b5-3536-4608-b1f4-0127ebd52bfa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.922643 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a546f2b5-3536-4608-b1f4-0127ebd52bfa-logs\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.922664 4911 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.922676 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.922685 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlzl6\" (UniqueName: \"kubernetes.io/projected/a546f2b5-3536-4608-b1f4-0127ebd52bfa-kube-api-access-zlzl6\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.924150 4911 generic.go:334] "Generic (PLEG): container finished" podID="f9b4b396-5ac9-490b-a59e-b568ca3cb638" containerID="c1064abdb392bc6ef5a31b8771d7552400a6cb1290ede03b00c41caa11337366" exitCode=0
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.924282 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m9ftl" event={"ID":"f9b4b396-5ac9-490b-a59e-b568ca3cb638","Type":"ContainerDied","Data":"c1064abdb392bc6ef5a31b8771d7552400a6cb1290ede03b00c41caa11337366"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.924324 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m9ftl" event={"ID":"f9b4b396-5ac9-490b-a59e-b568ca3cb638","Type":"ContainerStarted","Data":"3c919c76b04bb3c8057925673f278234357446bc53d0ef58f69f5b4d1deb7777"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.927618 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.839726884 podStartE2EDuration="17.927582204s" podCreationTimestamp="2026-03-10 14:24:35 +0000 UTC" firstStartedPulling="2026-03-10 14:24:36.914160963 +0000 UTC m=+1381.477680880" lastFinishedPulling="2026-03-10 14:24:51.002016283 +0000 UTC m=+1395.565536200" observedRunningTime="2026-03-10 14:24:52.920979922 +0000 UTC m=+1397.484499859" watchObservedRunningTime="2026-03-10 14:24:52.927582204 +0000 UTC m=+1397.491102121"
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.944612 4911 generic.go:334] "Generic (PLEG): container finished" podID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" containerID="d0a2307df180200bd46740c366eaa709060005f76fa90802cb4fb9cd917ba67b" exitCode=0
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.944691 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b58550bb-f4f6-40ef-9424-7bfd05d57c9d","Type":"ContainerDied","Data":"d0a2307df180200bd46740c366eaa709060005f76fa90802cb4fb9cd917ba67b"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.948343 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" event={"ID":"2e53f4fb-9bff-46a8-a447-e19ee4777c3b","Type":"ContainerStarted","Data":"73f076ceb280a0cd17667cc0386ae710c3b52ce5e9251dc76fbc7f827913d7cd"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.948372 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" event={"ID":"2e53f4fb-9bff-46a8-a447-e19ee4777c3b","Type":"ContainerStarted","Data":"ac55d48c03b833a442de116450e728ee1d5d14eea60f923a13de218c42e64f42"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.955427 4911 generic.go:334] "Generic (PLEG): container finished" podID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerID="33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792" exitCode=137
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.955760 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b4dd68964-gfvp8"
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.956007 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b4dd68964-gfvp8" event={"ID":"a546f2b5-3536-4608-b1f4-0127ebd52bfa","Type":"ContainerDied","Data":"33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.956034 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b4dd68964-gfvp8" event={"ID":"a546f2b5-3536-4608-b1f4-0127ebd52bfa","Type":"ContainerDied","Data":"b5b48fac898470bc81f43e5d81e9ece261227a7f678074e6d5b7d82735e90ea8"}
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.956057 4911 scope.go:117] "RemoveContainer" containerID="dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a"
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.971542 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-d929-account-create-update-z2j9n" podStartSLOduration=5.971492051 podStartE2EDuration="5.971492051s" podCreationTimestamp="2026-03-10 14:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:52.9393004 +0000 UTC m=+1397.502820317" watchObservedRunningTime="2026-03-10 14:24:52.971492051 +0000 UTC m=+1397.535011968"
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.974253 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a546f2b5-3536-4608-b1f4-0127ebd52bfa" (UID: "a546f2b5-3536-4608-b1f4-0127ebd52bfa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:52 crc kubenswrapper[4911]: I0310 14:24:52.974343 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a546f2b5-3536-4608-b1f4-0127ebd52bfa" (UID: "a546f2b5-3536-4608-b1f4-0127ebd52bfa"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.006559 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-scripts" (OuterVolumeSpecName: "scripts") pod "a546f2b5-3536-4608-b1f4-0127ebd52bfa" (UID: "a546f2b5-3536-4608-b1f4-0127ebd52bfa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.026832 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a546f2b5-3536-4608-b1f4-0127ebd52bfa-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.027124 4911 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.027137 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a546f2b5-3536-4608-b1f4-0127ebd52bfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.054094 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" podStartSLOduration=6.054056528 podStartE2EDuration="6.054056528s" podCreationTimestamp="2026-03-10 14:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:52.996119765 +0000 UTC m=+1397.559639682" watchObservedRunningTime="2026-03-10 14:24:53.054056528 +0000 UTC m=+1397.617576455"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.381010 4911 scope.go:117] "RemoveContainer" containerID="33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.406534 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.430218 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b4dd68964-gfvp8"]
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.441445 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-scripts\") pod \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") "
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.441537 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-public-tls-certs\") pod \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") "
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.441570 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-combined-ca-bundle\") pod \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") "
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.441596 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhz5p\" (UniqueName: \"kubernetes.io/projected/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-kube-api-access-fhz5p\") pod \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") "
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.441624 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-httpd-run\") pod \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") "
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.441672 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") "
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.441702 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-config-data\") pod \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") "
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.441767 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-logs\") pod \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\" (UID: \"b58550bb-f4f6-40ef-9424-7bfd05d57c9d\") "
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.442778 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-logs" (OuterVolumeSpecName: "logs") pod "b58550bb-f4f6-40ef-9424-7bfd05d57c9d" (UID: "b58550bb-f4f6-40ef-9424-7bfd05d57c9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.443618 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b58550bb-f4f6-40ef-9424-7bfd05d57c9d" (UID: "b58550bb-f4f6-40ef-9424-7bfd05d57c9d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.443665 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b4dd68964-gfvp8"]
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.444349 4911 scope.go:117] "RemoveContainer" containerID="dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.447898 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-kube-api-access-fhz5p" (OuterVolumeSpecName: "kube-api-access-fhz5p") pod "b58550bb-f4f6-40ef-9424-7bfd05d57c9d" (UID: "b58550bb-f4f6-40ef-9424-7bfd05d57c9d"). InnerVolumeSpecName "kube-api-access-fhz5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.449509 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "b58550bb-f4f6-40ef-9424-7bfd05d57c9d" (UID: "b58550bb-f4f6-40ef-9424-7bfd05d57c9d"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 14:24:53 crc kubenswrapper[4911]: E0310 14:24:53.449566 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a\": container with ID starting with dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a not found: ID does not exist" containerID="dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.449623 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a"} err="failed to get container status \"dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a\": rpc error: code = NotFound desc = could not find container \"dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a\": container with ID starting with dd05861b5b9fbd3f03f67e77f962fc5c1860ebacc9784bf26f716b52b007249a not found: ID does not exist"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.449672 4911 scope.go:117] "RemoveContainer" containerID="33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.454214 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-scripts" (OuterVolumeSpecName: "scripts") pod "b58550bb-f4f6-40ef-9424-7bfd05d57c9d" (UID: "b58550bb-f4f6-40ef-9424-7bfd05d57c9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:53 crc kubenswrapper[4911]: E0310 14:24:53.454878 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792\": container with ID starting with 33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792 not found: ID does not exist" containerID="33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.454928 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792"} err="failed to get container status \"33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792\": rpc error: code = NotFound desc = could not find container \"33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792\": container with ID starting with 33b6a8a9593cfd89d4cc52aaa1c7607c1bde048d1fb5fb116ef83c8e95c2a792 not found: ID does not exist"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.503647 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b58550bb-f4f6-40ef-9424-7bfd05d57c9d" (UID: "b58550bb-f4f6-40ef-9424-7bfd05d57c9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.531013 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-config-data" (OuterVolumeSpecName: "config-data") pod "b58550bb-f4f6-40ef-9424-7bfd05d57c9d" (UID: "b58550bb-f4f6-40ef-9424-7bfd05d57c9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.544788 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-logs\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.545117 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.545129 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.545141 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhz5p\" (UniqueName: \"kubernetes.io/projected/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-kube-api-access-fhz5p\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.545152 4911 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.545180 4911 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.545190 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.553671 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b58550bb-f4f6-40ef-9424-7bfd05d57c9d" (UID: "b58550bb-f4f6-40ef-9424-7bfd05d57c9d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.592316 4911 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.647563 4911 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b58550bb-f4f6-40ef-9424-7bfd05d57c9d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.647611 4911 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.970079 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b58550bb-f4f6-40ef-9424-7bfd05d57c9d","Type":"ContainerDied","Data":"22aa495f77854b28791c6198eef5f790b40e5b72fbf7ac5106488db6c05cab8e"}
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.971661 4911 scope.go:117] "RemoveContainer" containerID="d0a2307df180200bd46740c366eaa709060005f76fa90802cb4fb9cd917ba67b"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.971968 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.978113 4911 generic.go:334] "Generic (PLEG): container finished" podID="20d998cd-6fc6-498b-9a65-18dbf5931cc3" containerID="4a8a84684bed8d4e40666e945d53ca5a61b3dfc8972b3eb66b59e700a01915f0" exitCode=0
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.978290 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d929-account-create-update-z2j9n" event={"ID":"20d998cd-6fc6-498b-9a65-18dbf5931cc3","Type":"ContainerDied","Data":"4a8a84684bed8d4e40666e945d53ca5a61b3dfc8972b3eb66b59e700a01915f0"}
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.992213 4911 generic.go:334] "Generic (PLEG): container finished" podID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" containerID="dd9670e57c5cef170cc3de9608ba90e388a6380241c3fe1a1378fdc2169ceb0b" exitCode=0
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.992440 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4428f0a9-5fdc-4bdb-931e-da67fb9498c9","Type":"ContainerDied","Data":"dd9670e57c5cef170cc3de9608ba90e388a6380241c3fe1a1378fdc2169ceb0b"}
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.992610 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4428f0a9-5fdc-4bdb-931e-da67fb9498c9","Type":"ContainerDied","Data":"9e52857bca752217a2557471d292bc9df5ae5ebd923dc0237feb2ab9bdfbd7f8"}
Mar 10 14:24:53 crc kubenswrapper[4911]: I0310 14:24:53.992676 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e52857bca752217a2557471d292bc9df5ae5ebd923dc0237feb2ab9bdfbd7f8"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.002625 4911 generic.go:334] "Generic (PLEG): container finished" podID="e0b02600-8a57-4944-9f93-cdf74f7dad84" containerID="051e07b5e5787eba2f639031817bf7c502ed899b72406e551a99d22edbd6ecc6" exitCode=0
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.002740 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c623-account-create-update-pzwxt" event={"ID":"e0b02600-8a57-4944-9f93-cdf74f7dad84","Type":"ContainerDied","Data":"051e07b5e5787eba2f639031817bf7c502ed899b72406e551a99d22edbd6ecc6"}
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.006662 4911 generic.go:334] "Generic (PLEG): container finished" podID="34e0ca1a-a12e-45bd-8adc-3852606ec8a9" containerID="bcbc688113f8cb2ee00b0d735b3165ebf42dac0e7b4cddd1e0c84076a0695e3b" exitCode=0
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.006744 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lwpcs" event={"ID":"34e0ca1a-a12e-45bd-8adc-3852606ec8a9","Type":"ContainerDied","Data":"bcbc688113f8cb2ee00b0d735b3165ebf42dac0e7b4cddd1e0c84076a0695e3b"}
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.015571 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16236160-b958-4733-a1f7-b3bfe8aeac93","Type":"ContainerStarted","Data":"8123ad837045bba991cb2c2f47792d8e9c9e5051f2785c109eaea53052025a12"}
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.031703 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.063405 4911 generic.go:334] "Generic (PLEG): container finished" podID="41e63a1e-e526-498d-b14f-0720657e6c30" containerID="0f625dededc2b14e4a81dee42d61239a44f8452622e2682478cf067aa3a2b125" exitCode=0
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.063526 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wdsg2" event={"ID":"41e63a1e-e526-498d-b14f-0720657e6c30","Type":"ContainerDied","Data":"0f625dededc2b14e4a81dee42d61239a44f8452622e2682478cf067aa3a2b125"}
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.080733 4911 scope.go:117] "RemoveContainer" containerID="09a287c0af9ac56378919e06cbb2868faa3a49db885c95daf22c889f9f356e32"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.091081 4911 generic.go:334] "Generic (PLEG): container finished" podID="2e53f4fb-9bff-46a8-a447-e19ee4777c3b" containerID="73f076ceb280a0cd17667cc0386ae710c3b52ce5e9251dc76fbc7f827913d7cd" exitCode=0
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.091937 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.091972 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" event={"ID":"2e53f4fb-9bff-46a8-a447-e19ee4777c3b","Type":"ContainerDied","Data":"73f076ceb280a0cd17667cc0386ae710c3b52ce5e9251dc76fbc7f827913d7cd"}
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.096198 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.156783 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 14:24:54 crc kubenswrapper[4911]: E0310 14:24:54.163942 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" containerName="glance-httpd"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.163982 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" containerName="glance-httpd"
Mar 10 14:24:54 crc kubenswrapper[4911]: E0310 14:24:54.164054 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.164064 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon"
Mar 10 14:24:54 crc kubenswrapper[4911]: E0310 14:24:54.164080 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon-log"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.164088 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon-log"
Mar 10 14:24:54 crc kubenswrapper[4911]: E0310 14:24:54.164121 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" containerName="glance-log"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.164127 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" containerName="glance-log"
Mar 10 14:24:54 crc kubenswrapper[4911]: E0310 14:24:54.164155 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" containerName="glance-log"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.164162 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" containerName="glance-log"
Mar 10 14:24:54 crc kubenswrapper[4911]: E0310 14:24:54.164192 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" containerName="glance-httpd"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.164202 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" containerName="glance-httpd"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.165417 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" containerName="glance-log"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.165461 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" containerName="glance-httpd"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.165491 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" containerName="glance-httpd"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.165499 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon-log"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.165523 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" containerName="horizon"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.165538 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" containerName="glance-log"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.167426 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.173222 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-httpd-run\") pod \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") "
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.173291 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-internal-tls-certs\") pod \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") "
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.173326 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-logs\") pod \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") "
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.173469 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") "
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.173562 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-combined-ca-bundle\") pod \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") "
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.173634 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-config-data\") pod \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") "
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.174862 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lhcs\" (UniqueName: \"kubernetes.io/projected/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-kube-api-access-4lhcs\") pod \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") "
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.174942 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-scripts\") pod \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\" (UID: \"4428f0a9-5fdc-4bdb-931e-da67fb9498c9\") "
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.175558 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-logs" (OuterVolumeSpecName: "logs") pod "4428f0a9-5fdc-4bdb-931e-da67fb9498c9" (UID: "4428f0a9-5fdc-4bdb-931e-da67fb9498c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.175969 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4428f0a9-5fdc-4bdb-931e-da67fb9498c9" (UID: "4428f0a9-5fdc-4bdb-931e-da67fb9498c9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.180831 4911 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.180867 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-logs\") on node \"crc\" DevicePath \"\""
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.185487 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.185792 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.192085 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-scripts" (OuterVolumeSpecName: "scripts") pod "4428f0a9-5fdc-4bdb-931e-da67fb9498c9" (UID: "4428f0a9-5fdc-4bdb-931e-da67fb9498c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.211335 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-kube-api-access-4lhcs" (OuterVolumeSpecName: "kube-api-access-4lhcs") pod "4428f0a9-5fdc-4bdb-931e-da67fb9498c9" (UID: "4428f0a9-5fdc-4bdb-931e-da67fb9498c9"). InnerVolumeSpecName "kube-api-access-4lhcs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.290895 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.291023 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/249c149f-3423-4163-8358-b36f6d55c6f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.291131 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249c149f-3423-4163-8358-b36f6d55c6f3-logs\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.291232 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0"
Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.292514 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mssfk\" (UniqueName: \"kubernetes.io/projected/249c149f-3423-4163-8358-b36f6d55c6f3-kube-api-access-mssfk\") pod \"glance-default-external-api-0\" (UID:
\"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.292844 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.292922 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.293277 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.296372 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "4428f0a9-5fdc-4bdb-931e-da67fb9498c9" (UID: "4428f0a9-5fdc-4bdb-931e-da67fb9498c9"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.296520 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lhcs\" (UniqueName: \"kubernetes.io/projected/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-kube-api-access-4lhcs\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.297837 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.341609 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a546f2b5-3536-4608-b1f4-0127ebd52bfa" path="/var/lib/kubelet/pods/a546f2b5-3536-4608-b1f4-0127ebd52bfa/volumes" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.346627 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4428f0a9-5fdc-4bdb-931e-da67fb9498c9" (UID: "4428f0a9-5fdc-4bdb-931e-da67fb9498c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.347531 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58550bb-f4f6-40ef-9424-7bfd05d57c9d" path="/var/lib/kubelet/pods/b58550bb-f4f6-40ef-9424-7bfd05d57c9d/volumes" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.365683 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-config-data" (OuterVolumeSpecName: "config-data") pod "4428f0a9-5fdc-4bdb-931e-da67fb9498c9" (UID: "4428f0a9-5fdc-4bdb-931e-da67fb9498c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.384567 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.399995 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.400060 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.400081 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.400158 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.400185 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/249c149f-3423-4163-8358-b36f6d55c6f3-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.400211 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249c149f-3423-4163-8358-b36f6d55c6f3-logs\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.400245 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.400281 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mssfk\" (UniqueName: \"kubernetes.io/projected/249c149f-3423-4163-8358-b36f6d55c6f3-kube-api-access-mssfk\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.400376 4911 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.400392 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.400403 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.403631 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249c149f-3423-4163-8358-b36f6d55c6f3-logs\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.405034 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/249c149f-3423-4163-8358-b36f6d55c6f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.406018 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.412105 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.426821 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " 
pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.432400 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mssfk\" (UniqueName: \"kubernetes.io/projected/249c149f-3423-4163-8358-b36f6d55c6f3-kube-api-access-mssfk\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.434457 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.443003 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c149f-3423-4163-8358-b36f6d55c6f3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.447294 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-lwpcs" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.463856 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"249c149f-3423-4163-8358-b36f6d55c6f3\") " pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.476392 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4428f0a9-5fdc-4bdb-931e-da67fb9498c9" (UID: "4428f0a9-5fdc-4bdb-931e-da67fb9498c9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.501823 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-operator-scripts\") pod \"34e0ca1a-a12e-45bd-8adc-3852606ec8a9\" (UID: \"34e0ca1a-a12e-45bd-8adc-3852606ec8a9\") " Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.501946 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb5sj\" (UniqueName: \"kubernetes.io/projected/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-kube-api-access-zb5sj\") pod \"34e0ca1a-a12e-45bd-8adc-3852606ec8a9\" (UID: \"34e0ca1a-a12e-45bd-8adc-3852606ec8a9\") " Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.502353 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34e0ca1a-a12e-45bd-8adc-3852606ec8a9" (UID: "34e0ca1a-a12e-45bd-8adc-3852606ec8a9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.507457 4911 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4428f0a9-5fdc-4bdb-931e-da67fb9498c9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.507500 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.510611 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-kube-api-access-zb5sj" (OuterVolumeSpecName: "kube-api-access-zb5sj") pod "34e0ca1a-a12e-45bd-8adc-3852606ec8a9" (UID: "34e0ca1a-a12e-45bd-8adc-3852606ec8a9"). InnerVolumeSpecName "kube-api-access-zb5sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.529662 4911 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.620208 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb5sj\" (UniqueName: \"kubernetes.io/projected/34e0ca1a-a12e-45bd-8adc-3852606ec8a9-kube-api-access-zb5sj\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.620255 4911 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.663743 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 14:24:54 crc kubenswrapper[4911]: I0310 14:24:54.990963 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wdsg2" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.041546 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e63a1e-e526-498d-b14f-0720657e6c30-operator-scripts\") pod \"41e63a1e-e526-498d-b14f-0720657e6c30\" (UID: \"41e63a1e-e526-498d-b14f-0720657e6c30\") " Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.041705 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5m6\" (UniqueName: \"kubernetes.io/projected/41e63a1e-e526-498d-b14f-0720657e6c30-kube-api-access-vt5m6\") pod \"41e63a1e-e526-498d-b14f-0720657e6c30\" (UID: \"41e63a1e-e526-498d-b14f-0720657e6c30\") " Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.047039 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e63a1e-e526-498d-b14f-0720657e6c30-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41e63a1e-e526-498d-b14f-0720657e6c30" (UID: "41e63a1e-e526-498d-b14f-0720657e6c30"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.092377 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e63a1e-e526-498d-b14f-0720657e6c30-kube-api-access-vt5m6" (OuterVolumeSpecName: "kube-api-access-vt5m6") pod "41e63a1e-e526-498d-b14f-0720657e6c30" (UID: "41e63a1e-e526-498d-b14f-0720657e6c30"). InnerVolumeSpecName "kube-api-access-vt5m6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.134562 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lwpcs" event={"ID":"34e0ca1a-a12e-45bd-8adc-3852606ec8a9","Type":"ContainerDied","Data":"d95578050253791c9183fbff0165f22bb005d81339c8e40793fca4955290f49a"} Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.134621 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d95578050253791c9183fbff0165f22bb005d81339c8e40793fca4955290f49a" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.134738 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lwpcs" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.147850 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5m6\" (UniqueName: \"kubernetes.io/projected/41e63a1e-e526-498d-b14f-0720657e6c30-kube-api-access-vt5m6\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.147885 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e63a1e-e526-498d-b14f-0720657e6c30-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.186585 4911 generic.go:334] "Generic (PLEG): container finished" podID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" containerID="95be14fb3901116d3d2f49bbf6cd4ceb984aa453ad533d64f331c09118428d98" exitCode=0 Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.186659 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf9dfcf44-sls8z" event={"ID":"4b96f0e8-39e3-4018-bb9e-75b83d333bd4","Type":"ContainerDied","Data":"95be14fb3901116d3d2f49bbf6cd4ceb984aa453ad533d64f331c09118428d98"} Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.188148 4911 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-db-create-wdsg2" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.189487 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wdsg2" event={"ID":"41e63a1e-e526-498d-b14f-0720657e6c30","Type":"ContainerDied","Data":"b28bb771feb60b8011a64ef12cebd387f14af156c11e52e5f3f4c29aa340fe66"} Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.189577 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b28bb771feb60b8011a64ef12cebd387f14af156c11e52e5f3f4c29aa340fe66" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.189675 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.539406 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m9ftl" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.564770 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.574415 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.610245 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:24:55 crc kubenswrapper[4911]: E0310 14:24:55.615212 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b4b396-5ac9-490b-a59e-b568ca3cb638" containerName="mariadb-database-create" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.615254 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b4b396-5ac9-490b-a59e-b568ca3cb638" containerName="mariadb-database-create" Mar 10 14:24:55 crc kubenswrapper[4911]: E0310 14:24:55.615264 4911 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="34e0ca1a-a12e-45bd-8adc-3852606ec8a9" containerName="mariadb-database-create" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.615270 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e0ca1a-a12e-45bd-8adc-3852606ec8a9" containerName="mariadb-database-create" Mar 10 14:24:55 crc kubenswrapper[4911]: E0310 14:24:55.615308 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e63a1e-e526-498d-b14f-0720657e6c30" containerName="mariadb-database-create" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.615318 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e63a1e-e526-498d-b14f-0720657e6c30" containerName="mariadb-database-create" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.615686 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e0ca1a-a12e-45bd-8adc-3852606ec8a9" containerName="mariadb-database-create" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.615706 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e63a1e-e526-498d-b14f-0720657e6c30" containerName="mariadb-database-create" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.615737 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b4b396-5ac9-490b-a59e-b568ca3cb638" containerName="mariadb-database-create" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.617350 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.629208 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.631407 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.659360 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.661605 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b4b396-5ac9-490b-a59e-b568ca3cb638-operator-scripts\") pod \"f9b4b396-5ac9-490b-a59e-b568ca3cb638\" (UID: \"f9b4b396-5ac9-490b-a59e-b568ca3cb638\") " Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.661776 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsnhb\" (UniqueName: \"kubernetes.io/projected/f9b4b396-5ac9-490b-a59e-b568ca3cb638-kube-api-access-nsnhb\") pod \"f9b4b396-5ac9-490b-a59e-b568ca3cb638\" (UID: \"f9b4b396-5ac9-490b-a59e-b568ca3cb638\") " Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.662224 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/001c3353-3ca1-444c-a741-b2447e3ca566-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.662257 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.662309 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfglm\" (UniqueName: \"kubernetes.io/projected/001c3353-3ca1-444c-a741-b2447e3ca566-kube-api-access-tfglm\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.662356 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.662384 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/001c3353-3ca1-444c-a741-b2447e3ca566-logs\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.662425 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-config-data\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.662445 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-scripts\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.662493 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.678190 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b4b396-5ac9-490b-a59e-b568ca3cb638-kube-api-access-nsnhb" (OuterVolumeSpecName: "kube-api-access-nsnhb") pod "f9b4b396-5ac9-490b-a59e-b568ca3cb638" (UID: "f9b4b396-5ac9-490b-a59e-b568ca3cb638"). InnerVolumeSpecName "kube-api-access-nsnhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.712494 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b4b396-5ac9-490b-a59e-b568ca3cb638-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9b4b396-5ac9-490b-a59e-b568ca3cb638" (UID: "f9b4b396-5ac9-490b-a59e-b568ca3cb638"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.765028 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/001c3353-3ca1-444c-a741-b2447e3ca566-logs\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.765111 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-config-data\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.765138 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-scripts\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.765180 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.765230 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/001c3353-3ca1-444c-a741-b2447e3ca566-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc 
kubenswrapper[4911]: I0310 14:24:55.765257 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.765292 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfglm\" (UniqueName: \"kubernetes.io/projected/001c3353-3ca1-444c-a741-b2447e3ca566-kube-api-access-tfglm\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.765337 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.765403 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsnhb\" (UniqueName: \"kubernetes.io/projected/f9b4b396-5ac9-490b-a59e-b568ca3cb638-kube-api-access-nsnhb\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.765417 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b4b396-5ac9-490b-a59e-b568ca3cb638-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.767599 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"001c3353-3ca1-444c-a741-b2447e3ca566\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.768365 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/001c3353-3ca1-444c-a741-b2447e3ca566-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.768532 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/001c3353-3ca1-444c-a741-b2447e3ca566-logs\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.769203 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.770916 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.776818 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-scripts\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.777864 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c623-account-create-update-pzwxt" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.778244 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-config-data\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.788986 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001c3353-3ca1-444c-a741-b2447e3ca566-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.814152 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfglm\" (UniqueName: \"kubernetes.io/projected/001c3353-3ca1-444c-a741-b2447e3ca566-kube-api-access-tfglm\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.885941 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbvfq\" (UniqueName: \"kubernetes.io/projected/e0b02600-8a57-4944-9f93-cdf74f7dad84-kube-api-access-jbvfq\") pod \"e0b02600-8a57-4944-9f93-cdf74f7dad84\" (UID: \"e0b02600-8a57-4944-9f93-cdf74f7dad84\") " Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.886193 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0b02600-8a57-4944-9f93-cdf74f7dad84-operator-scripts\") pod \"e0b02600-8a57-4944-9f93-cdf74f7dad84\" (UID: \"e0b02600-8a57-4944-9f93-cdf74f7dad84\") " Mar 10 14:24:55 crc 
kubenswrapper[4911]: I0310 14:24:55.900008 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b02600-8a57-4944-9f93-cdf74f7dad84-kube-api-access-jbvfq" (OuterVolumeSpecName: "kube-api-access-jbvfq") pod "e0b02600-8a57-4944-9f93-cdf74f7dad84" (UID: "e0b02600-8a57-4944-9f93-cdf74f7dad84"). InnerVolumeSpecName "kube-api-access-jbvfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.904340 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0b02600-8a57-4944-9f93-cdf74f7dad84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0b02600-8a57-4944-9f93-cdf74f7dad84" (UID: "e0b02600-8a57-4944-9f93-cdf74f7dad84"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.927106 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"001c3353-3ca1-444c-a741-b2447e3ca566\") " pod="openstack/glance-default-internal-api-0" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.991905 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0b02600-8a57-4944-9f93-cdf74f7dad84-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:55 crc kubenswrapper[4911]: I0310 14:24:55.991956 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbvfq\" (UniqueName: \"kubernetes.io/projected/e0b02600-8a57-4944-9f93-cdf74f7dad84-kube-api-access-jbvfq\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.034807 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.101463 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.104680 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d929-account-create-update-z2j9n" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.119830 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.205487 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2zr5\" (UniqueName: \"kubernetes.io/projected/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-kube-api-access-p2zr5\") pod \"2e53f4fb-9bff-46a8-a447-e19ee4777c3b\" (UID: \"2e53f4fb-9bff-46a8-a447-e19ee4777c3b\") " Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.205583 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-config\") pod \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.205670 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-operator-scripts\") pod \"2e53f4fb-9bff-46a8-a447-e19ee4777c3b\" (UID: \"2e53f4fb-9bff-46a8-a447-e19ee4777c3b\") " Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.205714 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h85t2\" (UniqueName: \"kubernetes.io/projected/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-kube-api-access-h85t2\") 
pod \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.205812 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-httpd-config\") pod \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.205844 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhgfq\" (UniqueName: \"kubernetes.io/projected/20d998cd-6fc6-498b-9a65-18dbf5931cc3-kube-api-access-nhgfq\") pod \"20d998cd-6fc6-498b-9a65-18dbf5931cc3\" (UID: \"20d998cd-6fc6-498b-9a65-18dbf5931cc3\") " Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.205955 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20d998cd-6fc6-498b-9a65-18dbf5931cc3-operator-scripts\") pod \"20d998cd-6fc6-498b-9a65-18dbf5931cc3\" (UID: \"20d998cd-6fc6-498b-9a65-18dbf5931cc3\") " Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.205999 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-combined-ca-bundle\") pod \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.206063 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-ovndb-tls-certs\") pod \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\" (UID: \"4b96f0e8-39e3-4018-bb9e-75b83d333bd4\") " Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.217239 4911 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e53f4fb-9bff-46a8-a447-e19ee4777c3b" (UID: "2e53f4fb-9bff-46a8-a447-e19ee4777c3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.228045 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-kube-api-access-h85t2" (OuterVolumeSpecName: "kube-api-access-h85t2") pod "4b96f0e8-39e3-4018-bb9e-75b83d333bd4" (UID: "4b96f0e8-39e3-4018-bb9e-75b83d333bd4"). InnerVolumeSpecName "kube-api-access-h85t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.235747 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4b96f0e8-39e3-4018-bb9e-75b83d333bd4" (UID: "4b96f0e8-39e3-4018-bb9e-75b83d333bd4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.239540 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-kube-api-access-p2zr5" (OuterVolumeSpecName: "kube-api-access-p2zr5") pod "2e53f4fb-9bff-46a8-a447-e19ee4777c3b" (UID: "2e53f4fb-9bff-46a8-a447-e19ee4777c3b"). InnerVolumeSpecName "kube-api-access-p2zr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.252753 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20d998cd-6fc6-498b-9a65-18dbf5931cc3-kube-api-access-nhgfq" (OuterVolumeSpecName: "kube-api-access-nhgfq") pod "20d998cd-6fc6-498b-9a65-18dbf5931cc3" (UID: "20d998cd-6fc6-498b-9a65-18dbf5931cc3"). InnerVolumeSpecName "kube-api-access-nhgfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.272300 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20d998cd-6fc6-498b-9a65-18dbf5931cc3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "20d998cd-6fc6-498b-9a65-18dbf5931cc3" (UID: "20d998cd-6fc6-498b-9a65-18dbf5931cc3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.280161 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cf9dfcf44-sls8z" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.319480 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.319519 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h85t2\" (UniqueName: \"kubernetes.io/projected/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-kube-api-access-h85t2\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.319539 4911 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.319549 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhgfq\" (UniqueName: \"kubernetes.io/projected/20d998cd-6fc6-498b-9a65-18dbf5931cc3-kube-api-access-nhgfq\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.319563 4911 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20d998cd-6fc6-498b-9a65-18dbf5931cc3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.319580 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2zr5\" (UniqueName: \"kubernetes.io/projected/2e53f4fb-9bff-46a8-a447-e19ee4777c3b-kube-api-access-p2zr5\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.349760 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d929-account-create-update-z2j9n" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.349837 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.349869 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c623-account-create-update-pzwxt" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.350574 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4428f0a9-5fdc-4bdb-931e-da67fb9498c9" path="/var/lib/kubelet/pods/4428f0a9-5fdc-4bdb-931e-da67fb9498c9/volumes" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.359658 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf9dfcf44-sls8z" event={"ID":"4b96f0e8-39e3-4018-bb9e-75b83d333bd4","Type":"ContainerDied","Data":"1b348239954ffbb80979fd166d9a271796e64ea393716e0634cf49c0a534e6e8"} Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.359986 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d929-account-create-update-z2j9n" event={"ID":"20d998cd-6fc6-498b-9a65-18dbf5931cc3","Type":"ContainerDied","Data":"008c08817339ac7a977833d987d475be659ccd58fff480c660ef153d0c53976e"} Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.360068 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="008c08817339ac7a977833d987d475be659ccd58fff480c660ef153d0c53976e" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.360127 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5c94-account-create-update-9z2xg" event={"ID":"2e53f4fb-9bff-46a8-a447-e19ee4777c3b","Type":"ContainerDied","Data":"ac55d48c03b833a442de116450e728ee1d5d14eea60f923a13de218c42e64f42"} Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.360185 4911 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac55d48c03b833a442de116450e728ee1d5d14eea60f923a13de218c42e64f42" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.360239 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c623-account-create-update-pzwxt" event={"ID":"e0b02600-8a57-4944-9f93-cdf74f7dad84","Type":"ContainerDied","Data":"4d51b2afa1d65ef6a953cdfba65f6c7eb7f4287af888283e1c62ba894f165fe6"} Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.360306 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d51b2afa1d65ef6a953cdfba65f6c7eb7f4287af888283e1c62ba894f165fe6" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.360361 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"249c149f-3423-4163-8358-b36f6d55c6f3","Type":"ContainerStarted","Data":"d44ab8172808455952f1641dad42ca5725cd3cd15f9b5bfe749c037fab487ba2"} Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.360440 4911 scope.go:117] "RemoveContainer" containerID="401715d98a839e18293dc3f8c3e947d7a73c27c3071b88d923203d15ef2cca0b" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.365180 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16236160-b958-4733-a1f7-b3bfe8aeac93","Type":"ContainerStarted","Data":"e4479789c49e39d4220da2c528f4b43cc2334a898a40831c942635c4121a0ddd"} Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.375669 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m9ftl" event={"ID":"f9b4b396-5ac9-490b-a59e-b568ca3cb638","Type":"ContainerDied","Data":"3c919c76b04bb3c8057925673f278234357446bc53d0ef58f69f5b4d1deb7777"} Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.375739 4911 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3c919c76b04bb3c8057925673f278234357446bc53d0ef58f69f5b4d1deb7777" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.375828 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m9ftl" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.404995 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4b96f0e8-39e3-4018-bb9e-75b83d333bd4" (UID: "4b96f0e8-39e3-4018-bb9e-75b83d333bd4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.426917 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-config" (OuterVolumeSpecName: "config") pod "4b96f0e8-39e3-4018-bb9e-75b83d333bd4" (UID: "4b96f0e8-39e3-4018-bb9e-75b83d333bd4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.428818 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.428851 4911 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.430551 4911 scope.go:117] "RemoveContainer" containerID="95be14fb3901116d3d2f49bbf6cd4ceb984aa453ad533d64f331c09118428d98" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.438948 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b96f0e8-39e3-4018-bb9e-75b83d333bd4" (UID: "4b96f0e8-39e3-4018-bb9e-75b83d333bd4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.533867 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b96f0e8-39e3-4018-bb9e-75b83d333bd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.698683 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cf9dfcf44-sls8z"] Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.722618 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-cf9dfcf44-sls8z"] Mar 10 14:24:56 crc kubenswrapper[4911]: I0310 14:24:56.938398 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 14:24:56 crc kubenswrapper[4911]: W0310 14:24:56.946120 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c3353_3ca1_444c_a741_b2447e3ca566.slice/crio-943cc72f364664c4f42fa0c27fd581b95d85792a61afa58bde0b6744e9cbd315 WatchSource:0}: Error finding container 943cc72f364664c4f42fa0c27fd581b95d85792a61afa58bde0b6744e9cbd315: Status 404 returned error can't find the container with id 943cc72f364664c4f42fa0c27fd581b95d85792a61afa58bde0b6744e9cbd315 Mar 10 14:24:57 crc kubenswrapper[4911]: I0310 14:24:57.400210 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16236160-b958-4733-a1f7-b3bfe8aeac93","Type":"ContainerStarted","Data":"488172fb6142958849e20f65067a9cde03d802818dc096baa2de6b30a99d265f"} Mar 10 14:24:57 crc kubenswrapper[4911]: I0310 14:24:57.400441 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 14:24:57 crc kubenswrapper[4911]: I0310 14:24:57.401423 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="sg-core" containerID="cri-o://e4479789c49e39d4220da2c528f4b43cc2334a898a40831c942635c4121a0ddd" gracePeriod=30 Mar 10 14:24:57 crc kubenswrapper[4911]: I0310 14:24:57.401575 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="proxy-httpd" containerID="cri-o://488172fb6142958849e20f65067a9cde03d802818dc096baa2de6b30a99d265f" gracePeriod=30 Mar 10 14:24:57 crc kubenswrapper[4911]: I0310 14:24:57.401675 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="ceilometer-notification-agent" containerID="cri-o://8123ad837045bba991cb2c2f47792d8e9c9e5051f2785c109eaea53052025a12" gracePeriod=30 Mar 10 14:24:57 crc kubenswrapper[4911]: I0310 14:24:57.401778 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="ceilometer-central-agent" containerID="cri-o://9a029a5175990d2a360f99ceee30bf164bee47c919135c6a59d41b29117e09d8" gracePeriod=30 Mar 10 14:24:57 crc kubenswrapper[4911]: I0310 14:24:57.414244 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"001c3353-3ca1-444c-a741-b2447e3ca566","Type":"ContainerStarted","Data":"943cc72f364664c4f42fa0c27fd581b95d85792a61afa58bde0b6744e9cbd315"} Mar 10 14:24:57 crc kubenswrapper[4911]: I0310 14:24:57.433670 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"249c149f-3423-4163-8358-b36f6d55c6f3","Type":"ContainerStarted","Data":"6208f06813981e57dee2c4e0ba0f27741ef35915aa2c49b632c3e036cceaeaa6"} Mar 10 14:24:57 crc kubenswrapper[4911]: I0310 14:24:57.445213 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=10.069773325 podStartE2EDuration="15.445180728s" podCreationTimestamp="2026-03-10 14:24:42 +0000 UTC" firstStartedPulling="2026-03-10 14:24:51.587842946 +0000 UTC m=+1396.151362863" lastFinishedPulling="2026-03-10 14:24:56.963250349 +0000 UTC m=+1401.526770266" observedRunningTime="2026-03-10 14:24:57.433515154 +0000 UTC m=+1401.997035091" watchObservedRunningTime="2026-03-10 14:24:57.445180728 +0000 UTC m=+1402.008700645" Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.072126 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ls7n2"] Mar 10 14:24:58 crc kubenswrapper[4911]: E0310 14:24:58.073273 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" containerName="neutron-api" Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.073345 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" containerName="neutron-api" Mar 10 14:24:58 crc kubenswrapper[4911]: E0310 14:24:58.073445 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" containerName="neutron-httpd" Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.073508 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" containerName="neutron-httpd" Mar 10 14:24:58 crc kubenswrapper[4911]: E0310 14:24:58.073706 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d998cd-6fc6-498b-9a65-18dbf5931cc3" containerName="mariadb-account-create-update" Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.073789 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d998cd-6fc6-498b-9a65-18dbf5931cc3" containerName="mariadb-account-create-update" Mar 10 14:24:58 crc kubenswrapper[4911]: E0310 14:24:58.073875 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e0b02600-8a57-4944-9f93-cdf74f7dad84" containerName="mariadb-account-create-update"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.073954 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b02600-8a57-4944-9f93-cdf74f7dad84" containerName="mariadb-account-create-update"
Mar 10 14:24:58 crc kubenswrapper[4911]: E0310 14:24:58.074010 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e53f4fb-9bff-46a8-a447-e19ee4777c3b" containerName="mariadb-account-create-update"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.074075 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e53f4fb-9bff-46a8-a447-e19ee4777c3b" containerName="mariadb-account-create-update"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.076103 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="20d998cd-6fc6-498b-9a65-18dbf5931cc3" containerName="mariadb-account-create-update"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.076309 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" containerName="neutron-api"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.076376 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e53f4fb-9bff-46a8-a447-e19ee4777c3b" containerName="mariadb-account-create-update"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.076441 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b02600-8a57-4944-9f93-cdf74f7dad84" containerName="mariadb-account-create-update"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.076523 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" containerName="neutron-httpd"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.089075 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.103690 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.104642 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.104963 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-69dl5"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.233293 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.233460 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-scripts\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.233521 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-config-data\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.233545 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdbb2\" (UniqueName: \"kubernetes.io/projected/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-kube-api-access-vdbb2\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.248805 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b96f0e8-39e3-4018-bb9e-75b83d333bd4" path="/var/lib/kubelet/pods/4b96f0e8-39e3-4018-bb9e-75b83d333bd4/volumes"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.249772 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ls7n2"]
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.336999 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.337159 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-scripts\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.337206 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-config-data\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.337247 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdbb2\" (UniqueName: \"kubernetes.io/projected/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-kube-api-access-vdbb2\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.343486 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-config-data\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.345454 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.349231 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-scripts\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.361573 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdbb2\" (UniqueName: \"kubernetes.io/projected/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-kube-api-access-vdbb2\") pod \"nova-cell0-conductor-db-sync-ls7n2\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.436461 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ls7n2"
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.450499 4911 generic.go:334] "Generic (PLEG): container finished" podID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerID="488172fb6142958849e20f65067a9cde03d802818dc096baa2de6b30a99d265f" exitCode=0
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.450540 4911 generic.go:334] "Generic (PLEG): container finished" podID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerID="e4479789c49e39d4220da2c528f4b43cc2334a898a40831c942635c4121a0ddd" exitCode=2
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.450552 4911 generic.go:334] "Generic (PLEG): container finished" podID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerID="8123ad837045bba991cb2c2f47792d8e9c9e5051f2785c109eaea53052025a12" exitCode=0
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.450613 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16236160-b958-4733-a1f7-b3bfe8aeac93","Type":"ContainerDied","Data":"488172fb6142958849e20f65067a9cde03d802818dc096baa2de6b30a99d265f"}
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.450647 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16236160-b958-4733-a1f7-b3bfe8aeac93","Type":"ContainerDied","Data":"e4479789c49e39d4220da2c528f4b43cc2334a898a40831c942635c4121a0ddd"}
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.450658 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16236160-b958-4733-a1f7-b3bfe8aeac93","Type":"ContainerDied","Data":"8123ad837045bba991cb2c2f47792d8e9c9e5051f2785c109eaea53052025a12"}
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.453977 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"001c3353-3ca1-444c-a741-b2447e3ca566","Type":"ContainerStarted","Data":"b260c6b9b0f7c8238f98b249ecc9767e7e67cc7e67364c4edf97184b51eaee22"}
Mar 10 14:24:58 crc kubenswrapper[4911]: I0310 14:24:58.457676 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"249c149f-3423-4163-8358-b36f6d55c6f3","Type":"ContainerStarted","Data":"33165622a37945f88584c555a99d4e64b68ee2b223de61487cbf0405ff50e14e"}
Mar 10 14:24:59 crc kubenswrapper[4911]: I0310 14:24:59.150852 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.1508254749999995 podStartE2EDuration="5.150825475s" podCreationTimestamp="2026-03-10 14:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:58.492018214 +0000 UTC m=+1403.055538131" watchObservedRunningTime="2026-03-10 14:24:59.150825475 +0000 UTC m=+1403.714345412"
Mar 10 14:24:59 crc kubenswrapper[4911]: I0310 14:24:59.167376 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ls7n2"]
Mar 10 14:24:59 crc kubenswrapper[4911]: W0310 14:24:59.172075 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod022d43c3_2aa6_4720_9bc2_c79662f9ec3c.slice/crio-0afa578dbea57c02effa6b9ef9643ffd3e17bdf9f295e39448fcc1d9590c776e WatchSource:0}: Error finding container 0afa578dbea57c02effa6b9ef9643ffd3e17bdf9f295e39448fcc1d9590c776e: Status 404 returned error can't find the container with id 0afa578dbea57c02effa6b9ef9643ffd3e17bdf9f295e39448fcc1d9590c776e
Mar 10 14:24:59 crc kubenswrapper[4911]: I0310 14:24:59.471448 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"001c3353-3ca1-444c-a741-b2447e3ca566","Type":"ContainerStarted","Data":"f9d2cace9104660962c6ac5eea49d57072f6ee35c417c1bda695a77db4f9adf6"}
Mar 10 14:24:59 crc kubenswrapper[4911]: I0310 14:24:59.472461 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ls7n2" event={"ID":"022d43c3-2aa6-4720-9bc2-c79662f9ec3c","Type":"ContainerStarted","Data":"0afa578dbea57c02effa6b9ef9643ffd3e17bdf9f295e39448fcc1d9590c776e"}
Mar 10 14:24:59 crc kubenswrapper[4911]: I0310 14:24:59.510670 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.510649095 podStartE2EDuration="4.510649095s" podCreationTimestamp="2026-03-10 14:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:24:59.508490608 +0000 UTC m=+1404.072010555" watchObservedRunningTime="2026-03-10 14:24:59.510649095 +0000 UTC m=+1404.074169012"
Mar 10 14:25:03 crc kubenswrapper[4911]: I0310 14:25:03.530145 4911 generic.go:334] "Generic (PLEG): container finished" podID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerID="9a029a5175990d2a360f99ceee30bf164bee47c919135c6a59d41b29117e09d8" exitCode=0
Mar 10 14:25:03 crc kubenswrapper[4911]: I0310 14:25:03.530243 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16236160-b958-4733-a1f7-b3bfe8aeac93","Type":"ContainerDied","Data":"9a029a5175990d2a360f99ceee30bf164bee47c919135c6a59d41b29117e09d8"}
Mar 10 14:25:04 crc kubenswrapper[4911]: I0310 14:25:04.664207 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 10 14:25:04 crc kubenswrapper[4911]: I0310 14:25:04.664288 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 10 14:25:04 crc kubenswrapper[4911]: I0310 14:25:04.715460 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 10 14:25:04 crc kubenswrapper[4911]: I0310 14:25:04.722934 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 10 14:25:05 crc kubenswrapper[4911]: I0310 14:25:05.553626 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 14:25:05 crc kubenswrapper[4911]: I0310 14:25:05.553706 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 14:25:06 crc kubenswrapper[4911]: I0310 14:25:06.037440 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 10 14:25:06 crc kubenswrapper[4911]: I0310 14:25:06.037516 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 10 14:25:06 crc kubenswrapper[4911]: I0310 14:25:06.083657 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 10 14:25:06 crc kubenswrapper[4911]: I0310 14:25:06.105455 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 10 14:25:06 crc kubenswrapper[4911]: I0310 14:25:06.568536 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 10 14:25:06 crc kubenswrapper[4911]: I0310 14:25:06.569191 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 10 14:25:08 crc kubenswrapper[4911]: I0310 14:25:08.015195 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 14:25:08 crc kubenswrapper[4911]: I0310 14:25:08.015323 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 14:25:08 crc kubenswrapper[4911]: I0310 14:25:08.313929 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 14:25:08 crc kubenswrapper[4911]: I0310 14:25:08.586212 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 14:25:08 crc kubenswrapper[4911]: I0310 14:25:08.586249 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 14:25:08 crc kubenswrapper[4911]: I0310 14:25:08.823871 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 10 14:25:08 crc kubenswrapper[4911]: I0310 14:25:08.825438 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.732540 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.821296 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-combined-ca-bundle\") pod \"16236160-b958-4733-a1f7-b3bfe8aeac93\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") "
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.821349 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-log-httpd\") pod \"16236160-b958-4733-a1f7-b3bfe8aeac93\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") "
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.822358 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "16236160-b958-4733-a1f7-b3bfe8aeac93" (UID: "16236160-b958-4733-a1f7-b3bfe8aeac93"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.822412 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-run-httpd\") pod \"16236160-b958-4733-a1f7-b3bfe8aeac93\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") "
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.822469 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-sg-core-conf-yaml\") pod \"16236160-b958-4733-a1f7-b3bfe8aeac93\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") "
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.822655 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-scripts\") pod \"16236160-b958-4733-a1f7-b3bfe8aeac93\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") "
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.822751 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-config-data\") pod \"16236160-b958-4733-a1f7-b3bfe8aeac93\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") "
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.823359 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp5w2\" (UniqueName: \"kubernetes.io/projected/16236160-b958-4733-a1f7-b3bfe8aeac93-kube-api-access-lp5w2\") pod \"16236160-b958-4733-a1f7-b3bfe8aeac93\" (UID: \"16236160-b958-4733-a1f7-b3bfe8aeac93\") "
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.823867 4911 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.822662 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "16236160-b958-4733-a1f7-b3bfe8aeac93" (UID: "16236160-b958-4733-a1f7-b3bfe8aeac93"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.829263 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-scripts" (OuterVolumeSpecName: "scripts") pod "16236160-b958-4733-a1f7-b3bfe8aeac93" (UID: "16236160-b958-4733-a1f7-b3bfe8aeac93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.834010 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16236160-b958-4733-a1f7-b3bfe8aeac93-kube-api-access-lp5w2" (OuterVolumeSpecName: "kube-api-access-lp5w2") pod "16236160-b958-4733-a1f7-b3bfe8aeac93" (UID: "16236160-b958-4733-a1f7-b3bfe8aeac93"). InnerVolumeSpecName "kube-api-access-lp5w2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.858363 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "16236160-b958-4733-a1f7-b3bfe8aeac93" (UID: "16236160-b958-4733-a1f7-b3bfe8aeac93"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.907074 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16236160-b958-4733-a1f7-b3bfe8aeac93" (UID: "16236160-b958-4733-a1f7-b3bfe8aeac93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.926424 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp5w2\" (UniqueName: \"kubernetes.io/projected/16236160-b958-4733-a1f7-b3bfe8aeac93-kube-api-access-lp5w2\") on node \"crc\" DevicePath \"\""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.926477 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.926490 4911 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16236160-b958-4733-a1f7-b3bfe8aeac93-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.926502 4911 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.926516 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 14:25:09 crc kubenswrapper[4911]: I0310 14:25:09.969893 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-config-data" (OuterVolumeSpecName: "config-data") pod "16236160-b958-4733-a1f7-b3bfe8aeac93" (UID: "16236160-b958-4733-a1f7-b3bfe8aeac93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.028512 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16236160-b958-4733-a1f7-b3bfe8aeac93-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.663516 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ls7n2" event={"ID":"022d43c3-2aa6-4720-9bc2-c79662f9ec3c","Type":"ContainerStarted","Data":"07eb6bb91dfc269f60ea6291d5447ac5382f38d6404f50e03dd2d82adf7063f9"}
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.696408 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ls7n2" podStartSLOduration=2.4983723270000002 podStartE2EDuration="12.696385671s" podCreationTimestamp="2026-03-10 14:24:58 +0000 UTC" firstStartedPulling="2026-03-10 14:24:59.174621436 +0000 UTC m=+1403.738141363" lastFinishedPulling="2026-03-10 14:25:09.37263479 +0000 UTC m=+1413.936154707" observedRunningTime="2026-03-10 14:25:10.687036167 +0000 UTC m=+1415.250556104" watchObservedRunningTime="2026-03-10 14:25:10.696385671 +0000 UTC m=+1415.259905588"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.698528 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.699301 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16236160-b958-4733-a1f7-b3bfe8aeac93","Type":"ContainerDied","Data":"24f67465cd59d7b6929045b85ae26632cae0cb19ca51aa294c4d2eefd1c4030e"}
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.699370 4911 scope.go:117] "RemoveContainer" containerID="488172fb6142958849e20f65067a9cde03d802818dc096baa2de6b30a99d265f"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.736055 4911 scope.go:117] "RemoveContainer" containerID="e4479789c49e39d4220da2c528f4b43cc2334a898a40831c942635c4121a0ddd"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.755815 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.771784 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.780023 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:25:10 crc kubenswrapper[4911]: E0310 14:25:10.780572 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="proxy-httpd"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.780597 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="proxy-httpd"
Mar 10 14:25:10 crc kubenswrapper[4911]: E0310 14:25:10.780630 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="ceilometer-central-agent"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.780638 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="ceilometer-central-agent"
Mar 10 14:25:10 crc kubenswrapper[4911]: E0310 14:25:10.780646 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="sg-core"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.780653 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="sg-core"
Mar 10 14:25:10 crc kubenswrapper[4911]: E0310 14:25:10.780670 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="ceilometer-notification-agent"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.780677 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="ceilometer-notification-agent"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.780909 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="proxy-httpd"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.780939 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="ceilometer-central-agent"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.780948 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="sg-core"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.780958 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" containerName="ceilometer-notification-agent"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.783572 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.787176 4911 scope.go:117] "RemoveContainer" containerID="8123ad837045bba991cb2c2f47792d8e9c9e5051f2785c109eaea53052025a12"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.787629 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.788278 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.793928 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.835321 4911 scope.go:117] "RemoveContainer" containerID="9a029a5175990d2a360f99ceee30bf164bee47c919135c6a59d41b29117e09d8"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.954434 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-run-httpd\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.954513 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.954545 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77nql\" (UniqueName: \"kubernetes.io/projected/d190ccc8-5a25-4362-9b04-b1381bdff384-kube-api-access-77nql\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.954665 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.954734 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-log-httpd\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.954762 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-config-data\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:10 crc kubenswrapper[4911]: I0310 14:25:10.954957 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-scripts\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.057502 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-run-httpd\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.057594 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.057627 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77nql\" (UniqueName: \"kubernetes.io/projected/d190ccc8-5a25-4362-9b04-b1381bdff384-kube-api-access-77nql\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.057739 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.057793 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-log-httpd\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.057832 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-config-data\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.057871 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-scripts\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.059091 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-run-httpd\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.059319 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-log-httpd\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.064308 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.067385 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.081543 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-scripts\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.095642 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77nql\" (UniqueName: \"kubernetes.io/projected/d190ccc8-5a25-4362-9b04-b1381bdff384-kube-api-access-77nql\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.112838 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-config-data\") pod \"ceilometer-0\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.130283 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.669140 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:25:11 crc kubenswrapper[4911]: W0310 14:25:11.676549 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd190ccc8_5a25_4362_9b04_b1381bdff384.slice/crio-2d557d7cb5cb25d3accdd4a9bead3f4fc369770fcceea9a7e52c4eaf88539bf4 WatchSource:0}: Error finding container 2d557d7cb5cb25d3accdd4a9bead3f4fc369770fcceea9a7e52c4eaf88539bf4: Status 404 returned error can't find the container with id 2d557d7cb5cb25d3accdd4a9bead3f4fc369770fcceea9a7e52c4eaf88539bf4
Mar 10 14:25:11 crc kubenswrapper[4911]: I0310 14:25:11.715022 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d190ccc8-5a25-4362-9b04-b1381bdff384","Type":"ContainerStarted","Data":"2d557d7cb5cb25d3accdd4a9bead3f4fc369770fcceea9a7e52c4eaf88539bf4"}
Mar 10 14:25:12 crc kubenswrapper[4911]: I0310 14:25:12.206188 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16236160-b958-4733-a1f7-b3bfe8aeac93" path="/var/lib/kubelet/pods/16236160-b958-4733-a1f7-b3bfe8aeac93/volumes"
Mar 10 14:25:12 crc kubenswrapper[4911]: I0310 14:25:12.726407 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d190ccc8-5a25-4362-9b04-b1381bdff384","Type":"ContainerStarted","Data":"932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece"}
Mar 10 14:25:13 crc kubenswrapper[4911]: I0310 14:25:13.740193 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d190ccc8-5a25-4362-9b04-b1381bdff384","Type":"ContainerStarted","Data":"d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c"}
Mar 10 14:25:14 crc kubenswrapper[4911]: I0310 14:25:14.758056 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d190ccc8-5a25-4362-9b04-b1381bdff384","Type":"ContainerStarted","Data":"e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117"}
Mar 10 14:25:16 crc kubenswrapper[4911]: I0310 14:25:16.782592 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d190ccc8-5a25-4362-9b04-b1381bdff384","Type":"ContainerStarted","Data":"750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757"}
Mar 10 14:25:16 crc kubenswrapper[4911]: I0310 14:25:16.783296 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 14:25:16 crc kubenswrapper[4911]: I0310 14:25:16.815604 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.972821048 podStartE2EDuration="6.815573603s" podCreationTimestamp="2026-03-10 14:25:10 +0000 UTC" firstStartedPulling="2026-03-10 14:25:11.682201234 +0000 UTC m=+1416.245721141" lastFinishedPulling="2026-03-10 14:25:15.524953779 +0000 UTC m=+1420.088473696" observedRunningTime="2026-03-10 14:25:16.805780547 +0000 UTC m=+1421.369300484" watchObservedRunningTime="2026-03-10 14:25:16.815573603 +0000 UTC m=+1421.379093520"
Mar 10 14:25:18 crc kubenswrapper[4911]: I0310 14:25:18.520890 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:25:18 crc kubenswrapper[4911]: I0310 14:25:18.520964 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:25:18 crc kubenswrapper[4911]: I0310 14:25:18.521325 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:25:18 crc kubenswrapper[4911]: I0310 14:25:18.522229 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19d28b4207c776d043f5f0d2450f7371800625af7e9dbf7c4bc17586e1f99a7f"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 14:25:18 crc kubenswrapper[4911]: I0310 14:25:18.522614 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://19d28b4207c776d043f5f0d2450f7371800625af7e9dbf7c4bc17586e1f99a7f" gracePeriod=600 Mar 10 14:25:18 crc kubenswrapper[4911]: I0310 14:25:18.807514 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="19d28b4207c776d043f5f0d2450f7371800625af7e9dbf7c4bc17586e1f99a7f" exitCode=0 Mar 10 14:25:18 crc kubenswrapper[4911]: I0310 14:25:18.807579 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"19d28b4207c776d043f5f0d2450f7371800625af7e9dbf7c4bc17586e1f99a7f"} Mar 10 14:25:18 crc kubenswrapper[4911]: I0310 14:25:18.807646 4911 scope.go:117] "RemoveContainer" containerID="b451ff8e7fd4c75aa2b34c26affcab379b47b137f2e280e6643a4a5092850d94" Mar 10 14:25:19 crc kubenswrapper[4911]: I0310 14:25:19.822512 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"} Mar 10 14:25:24 crc kubenswrapper[4911]: I0310 14:25:24.868138 4911 generic.go:334] "Generic (PLEG): container finished" podID="022d43c3-2aa6-4720-9bc2-c79662f9ec3c" containerID="07eb6bb91dfc269f60ea6291d5447ac5382f38d6404f50e03dd2d82adf7063f9" exitCode=0 Mar 10 14:25:24 crc kubenswrapper[4911]: I0310 14:25:24.868364 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ls7n2" event={"ID":"022d43c3-2aa6-4720-9bc2-c79662f9ec3c","Type":"ContainerDied","Data":"07eb6bb91dfc269f60ea6291d5447ac5382f38d6404f50e03dd2d82adf7063f9"} Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.269241 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ls7n2" Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.413596 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-scripts\") pod \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.413832 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-combined-ca-bundle\") pod \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.413880 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-config-data\") pod \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.414172 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdbb2\" (UniqueName: \"kubernetes.io/projected/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-kube-api-access-vdbb2\") pod \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\" (UID: \"022d43c3-2aa6-4720-9bc2-c79662f9ec3c\") " Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.421224 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-kube-api-access-vdbb2" (OuterVolumeSpecName: "kube-api-access-vdbb2") pod "022d43c3-2aa6-4720-9bc2-c79662f9ec3c" (UID: "022d43c3-2aa6-4720-9bc2-c79662f9ec3c"). InnerVolumeSpecName "kube-api-access-vdbb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.424409 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-scripts" (OuterVolumeSpecName: "scripts") pod "022d43c3-2aa6-4720-9bc2-c79662f9ec3c" (UID: "022d43c3-2aa6-4720-9bc2-c79662f9ec3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.446165 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-config-data" (OuterVolumeSpecName: "config-data") pod "022d43c3-2aa6-4720-9bc2-c79662f9ec3c" (UID: "022d43c3-2aa6-4720-9bc2-c79662f9ec3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.449224 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "022d43c3-2aa6-4720-9bc2-c79662f9ec3c" (UID: "022d43c3-2aa6-4720-9bc2-c79662f9ec3c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.516995 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.517029 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.517040 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdbb2\" (UniqueName: \"kubernetes.io/projected/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-kube-api-access-vdbb2\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.517050 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022d43c3-2aa6-4720-9bc2-c79662f9ec3c-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.888778 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ls7n2" event={"ID":"022d43c3-2aa6-4720-9bc2-c79662f9ec3c","Type":"ContainerDied","Data":"0afa578dbea57c02effa6b9ef9643ffd3e17bdf9f295e39448fcc1d9590c776e"} Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.888831 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0afa578dbea57c02effa6b9ef9643ffd3e17bdf9f295e39448fcc1d9590c776e" Mar 10 14:25:26 crc kubenswrapper[4911]: I0310 14:25:26.888940 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ls7n2" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.083466 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 14:25:27 crc kubenswrapper[4911]: E0310 14:25:27.087279 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="022d43c3-2aa6-4720-9bc2-c79662f9ec3c" containerName="nova-cell0-conductor-db-sync" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.087313 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="022d43c3-2aa6-4720-9bc2-c79662f9ec3c" containerName="nova-cell0-conductor-db-sync" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.087571 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="022d43c3-2aa6-4720-9bc2-c79662f9ec3c" containerName="nova-cell0-conductor-db-sync" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.088448 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.094479 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-69dl5" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.094905 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.107057 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.233138 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 
14:25:27.234332 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtws5\" (UniqueName: \"kubernetes.io/projected/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-kube-api-access-qtws5\") pod \"nova-cell0-conductor-0\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.234550 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.336680 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.337602 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.337947 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtws5\" (UniqueName: \"kubernetes.io/projected/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-kube-api-access-qtws5\") pod \"nova-cell0-conductor-0\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.341783 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.343168 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.363336 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtws5\" (UniqueName: \"kubernetes.io/projected/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-kube-api-access-qtws5\") pod \"nova-cell0-conductor-0\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.410776 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:27 crc kubenswrapper[4911]: I0310 14:25:27.925404 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 14:25:27 crc kubenswrapper[4911]: W0310 14:25:27.927172 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb3a0241_0c0a_43c8_b76f_4c7a91c4edcc.slice/crio-9da7d23f9ee4bee12c522054ab4672c0ff0861aa72ffe29d0a18b42dd5bf149c WatchSource:0}: Error finding container 9da7d23f9ee4bee12c522054ab4672c0ff0861aa72ffe29d0a18b42dd5bf149c: Status 404 returned error can't find the container with id 9da7d23f9ee4bee12c522054ab4672c0ff0861aa72ffe29d0a18b42dd5bf149c Mar 10 14:25:28 crc kubenswrapper[4911]: I0310 14:25:28.577095 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 14:25:28 crc kubenswrapper[4911]: I0310 14:25:28.909970 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc","Type":"ContainerStarted","Data":"397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3"} Mar 10 14:25:28 crc kubenswrapper[4911]: I0310 14:25:28.910297 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc","Type":"ContainerStarted","Data":"9da7d23f9ee4bee12c522054ab4672c0ff0861aa72ffe29d0a18b42dd5bf149c"} Mar 10 14:25:28 crc kubenswrapper[4911]: I0310 14:25:28.910605 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 14:25:28 crc kubenswrapper[4911]: I0310 14:25:28.929686 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.929663844 podStartE2EDuration="1.929663844s" podCreationTimestamp="2026-03-10 14:25:27 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:25:28.927711722 +0000 UTC m=+1433.491231649" watchObservedRunningTime="2026-03-10 14:25:28.929663844 +0000 UTC m=+1433.493183761" Mar 10 14:25:29 crc kubenswrapper[4911]: I0310 14:25:29.918115 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" containerName="nova-cell0-conductor-conductor" containerID="cri-o://397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" gracePeriod=30 Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.352194 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.352814 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="ceilometer-central-agent" containerID="cri-o://932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece" gracePeriod=30 Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.352901 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="proxy-httpd" containerID="cri-o://750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757" gracePeriod=30 Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.352963 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="ceilometer-notification-agent" containerID="cri-o://d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c" gracePeriod=30 Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.353095 4911 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="sg-core" containerID="cri-o://e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117" gracePeriod=30 Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.362196 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.188:3000/\": EOF" Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.946983 4911 generic.go:334] "Generic (PLEG): container finished" podID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerID="750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757" exitCode=0 Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.948353 4911 generic.go:334] "Generic (PLEG): container finished" podID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerID="e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117" exitCode=2 Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.948441 4911 generic.go:334] "Generic (PLEG): container finished" podID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerID="932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece" exitCode=0 Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.947941 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d190ccc8-5a25-4362-9b04-b1381bdff384","Type":"ContainerDied","Data":"750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757"} Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.948821 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d190ccc8-5a25-4362-9b04-b1381bdff384","Type":"ContainerDied","Data":"e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117"} Mar 10 14:25:30 crc kubenswrapper[4911]: I0310 14:25:30.948924 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d190ccc8-5a25-4362-9b04-b1381bdff384","Type":"ContainerDied","Data":"932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece"} Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.323064 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.437395 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-sg-core-conf-yaml\") pod \"d190ccc8-5a25-4362-9b04-b1381bdff384\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.437484 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-combined-ca-bundle\") pod \"d190ccc8-5a25-4362-9b04-b1381bdff384\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.437625 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77nql\" (UniqueName: \"kubernetes.io/projected/d190ccc8-5a25-4362-9b04-b1381bdff384-kube-api-access-77nql\") pod \"d190ccc8-5a25-4362-9b04-b1381bdff384\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.437663 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-config-data\") pod \"d190ccc8-5a25-4362-9b04-b1381bdff384\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.437842 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-log-httpd\") pod \"d190ccc8-5a25-4362-9b04-b1381bdff384\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.437874 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-run-httpd\") pod \"d190ccc8-5a25-4362-9b04-b1381bdff384\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.437897 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-scripts\") pod \"d190ccc8-5a25-4362-9b04-b1381bdff384\" (UID: \"d190ccc8-5a25-4362-9b04-b1381bdff384\") " Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.438843 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d190ccc8-5a25-4362-9b04-b1381bdff384" (UID: "d190ccc8-5a25-4362-9b04-b1381bdff384"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.439058 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d190ccc8-5a25-4362-9b04-b1381bdff384" (UID: "d190ccc8-5a25-4362-9b04-b1381bdff384"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.444796 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d190ccc8-5a25-4362-9b04-b1381bdff384-kube-api-access-77nql" (OuterVolumeSpecName: "kube-api-access-77nql") pod "d190ccc8-5a25-4362-9b04-b1381bdff384" (UID: "d190ccc8-5a25-4362-9b04-b1381bdff384"). InnerVolumeSpecName "kube-api-access-77nql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.446817 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-scripts" (OuterVolumeSpecName: "scripts") pod "d190ccc8-5a25-4362-9b04-b1381bdff384" (UID: "d190ccc8-5a25-4362-9b04-b1381bdff384"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.475674 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d190ccc8-5a25-4362-9b04-b1381bdff384" (UID: "d190ccc8-5a25-4362-9b04-b1381bdff384"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.540497 4911 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.540539 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77nql\" (UniqueName: \"kubernetes.io/projected/d190ccc8-5a25-4362-9b04-b1381bdff384-kube-api-access-77nql\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.540556 4911 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.540567 4911 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d190ccc8-5a25-4362-9b04-b1381bdff384-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.540578 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.555713 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d190ccc8-5a25-4362-9b04-b1381bdff384" (UID: "d190ccc8-5a25-4362-9b04-b1381bdff384"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.558411 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-config-data" (OuterVolumeSpecName: "config-data") pod "d190ccc8-5a25-4362-9b04-b1381bdff384" (UID: "d190ccc8-5a25-4362-9b04-b1381bdff384"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.642825 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.642892 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d190ccc8-5a25-4362-9b04-b1381bdff384-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.965121 4911 generic.go:334] "Generic (PLEG): container finished" podID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerID="d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c" exitCode=0 Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.965277 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.965314 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d190ccc8-5a25-4362-9b04-b1381bdff384","Type":"ContainerDied","Data":"d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c"} Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.966234 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d190ccc8-5a25-4362-9b04-b1381bdff384","Type":"ContainerDied","Data":"2d557d7cb5cb25d3accdd4a9bead3f4fc369770fcceea9a7e52c4eaf88539bf4"} Mar 10 14:25:31 crc kubenswrapper[4911]: I0310 14:25:31.966282 4911 scope.go:117] "RemoveContainer" containerID="750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:31.997586 4911 scope.go:117] "RemoveContainer" containerID="e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.031454 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.046817 4911 scope.go:117] "RemoveContainer" containerID="d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.055822 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.073272 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:25:32 crc kubenswrapper[4911]: E0310 14:25:32.073777 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="ceilometer-notification-agent" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.073801 4911 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="ceilometer-notification-agent" Mar 10 14:25:32 crc kubenswrapper[4911]: E0310 14:25:32.073827 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="proxy-httpd" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.073836 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="proxy-httpd" Mar 10 14:25:32 crc kubenswrapper[4911]: E0310 14:25:32.073851 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="sg-core" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.073857 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="sg-core" Mar 10 14:25:32 crc kubenswrapper[4911]: E0310 14:25:32.073873 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="ceilometer-central-agent" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.073879 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="ceilometer-central-agent" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.074062 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="proxy-httpd" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.074078 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="ceilometer-notification-agent" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.074087 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="ceilometer-central-agent" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.074105 4911 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" containerName="sg-core" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.076078 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.078376 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.082825 4911 scope.go:117] "RemoveContainer" containerID="932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.104042 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.105204 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.140196 4911 scope.go:117] "RemoveContainer" containerID="750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757" Mar 10 14:25:32 crc kubenswrapper[4911]: E0310 14:25:32.141404 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757\": container with ID starting with 750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757 not found: ID does not exist" containerID="750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.141503 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757"} err="failed to get container status \"750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757\": rpc error: code = NotFound desc = could not find container 
\"750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757\": container with ID starting with 750179cab32f17a2132a764504c0571c99fc50066f8bda2d02dd1cec5277f757 not found: ID does not exist" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.141542 4911 scope.go:117] "RemoveContainer" containerID="e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117" Mar 10 14:25:32 crc kubenswrapper[4911]: E0310 14:25:32.142040 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117\": container with ID starting with e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117 not found: ID does not exist" containerID="e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.142147 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117"} err="failed to get container status \"e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117\": rpc error: code = NotFound desc = could not find container \"e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117\": container with ID starting with e65e9f6b1b58775c5e4bb88676ce2d4d81022b856448e01e804daa049ee36117 not found: ID does not exist" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.142274 4911 scope.go:117] "RemoveContainer" containerID="d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c" Mar 10 14:25:32 crc kubenswrapper[4911]: E0310 14:25:32.142717 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c\": container with ID starting with d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c not found: ID does not exist" 
containerID="d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.142793 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c"} err="failed to get container status \"d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c\": rpc error: code = NotFound desc = could not find container \"d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c\": container with ID starting with d3bdad38630212681d918a05cd60962400713b1358c402ee0e5a708ef9c2ed6c not found: ID does not exist" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.142835 4911 scope.go:117] "RemoveContainer" containerID="932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece" Mar 10 14:25:32 crc kubenswrapper[4911]: E0310 14:25:32.143241 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece\": container with ID starting with 932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece not found: ID does not exist" containerID="932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.143283 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece"} err="failed to get container status \"932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece\": rpc error: code = NotFound desc = could not find container \"932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece\": container with ID starting with 932a4b503397de1f95ece49657f4bf34adcc4572c17db17f335d2e6f56193ece not found: ID does not exist" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.152110 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-log-httpd\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.152201 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-config-data\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.152237 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.152311 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.152331 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-scripts\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.152356 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w575w\" (UniqueName: 
\"kubernetes.io/projected/3d56827f-8d2b-4f1e-a623-36b865f282d6-kube-api-access-w575w\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.152392 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-run-httpd\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.204478 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d190ccc8-5a25-4362-9b04-b1381bdff384" path="/var/lib/kubelet/pods/d190ccc8-5a25-4362-9b04-b1381bdff384/volumes" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.254644 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.254691 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-scripts\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.254766 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w575w\" (UniqueName: \"kubernetes.io/projected/3d56827f-8d2b-4f1e-a623-36b865f282d6-kube-api-access-w575w\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.254831 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-run-httpd\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.254863 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-log-httpd\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.254955 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-config-data\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.254983 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.255529 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-log-httpd\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.255597 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-run-httpd\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 
14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.259457 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-config-data\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.259570 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.260297 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.260339 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-scripts\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.276560 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w575w\" (UniqueName: \"kubernetes.io/projected/3d56827f-8d2b-4f1e-a623-36b865f282d6-kube-api-access-w575w\") pod \"ceilometer-0\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") " pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.424332 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.903782 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:25:32 crc kubenswrapper[4911]: I0310 14:25:32.981629 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d56827f-8d2b-4f1e-a623-36b865f282d6","Type":"ContainerStarted","Data":"f5b0acdea9ec83b2a3b22bac5dcc090aba04154edaece99b4a3b87a6212188cd"} Mar 10 14:25:33 crc kubenswrapper[4911]: I0310 14:25:33.998605 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d56827f-8d2b-4f1e-a623-36b865f282d6","Type":"ContainerStarted","Data":"97607404fb6fa193b39935df1b99dba949d260c588edf94d288757513a40b1f3"} Mar 10 14:25:35 crc kubenswrapper[4911]: I0310 14:25:35.017852 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d56827f-8d2b-4f1e-a623-36b865f282d6","Type":"ContainerStarted","Data":"d2b9bfc16bf3c24d6ab95cc5111f2866b68518587ee5b4309371c9083df253e5"} Mar 10 14:25:36 crc kubenswrapper[4911]: I0310 14:25:36.041396 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d56827f-8d2b-4f1e-a623-36b865f282d6","Type":"ContainerStarted","Data":"c8e7a328b8fe1b1c77334bc0dc99f9baabb5321c23a258c9ab2202910d51db76"} Mar 10 14:25:37 crc kubenswrapper[4911]: E0310 14:25:37.415832 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:37 crc kubenswrapper[4911]: E0310 14:25:37.418406 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:37 crc kubenswrapper[4911]: E0310 14:25:37.419587 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:37 crc kubenswrapper[4911]: E0310 14:25:37.419707 4911 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" containerName="nova-cell0-conductor-conductor" Mar 10 14:25:38 crc kubenswrapper[4911]: I0310 14:25:38.068709 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d56827f-8d2b-4f1e-a623-36b865f282d6","Type":"ContainerStarted","Data":"c5f4bd0e4b31d2e2feeb9bb46bd491108d8930c4da09276dacc07aeca5be0c6e"} Mar 10 14:25:38 crc kubenswrapper[4911]: I0310 14:25:38.068916 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 14:25:38 crc kubenswrapper[4911]: I0310 14:25:38.095194 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.738365325 podStartE2EDuration="6.095167182s" podCreationTimestamp="2026-03-10 14:25:32 +0000 UTC" firstStartedPulling="2026-03-10 14:25:32.908252819 +0000 UTC m=+1437.471772736" lastFinishedPulling="2026-03-10 14:25:37.265054676 +0000 UTC m=+1441.828574593" observedRunningTime="2026-03-10 14:25:38.090867198 +0000 UTC m=+1442.654387165" watchObservedRunningTime="2026-03-10 
14:25:38.095167182 +0000 UTC m=+1442.658687109" Mar 10 14:25:42 crc kubenswrapper[4911]: E0310 14:25:42.414170 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:42 crc kubenswrapper[4911]: E0310 14:25:42.416330 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:42 crc kubenswrapper[4911]: E0310 14:25:42.420544 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:42 crc kubenswrapper[4911]: E0310 14:25:42.420633 4911 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" containerName="nova-cell0-conductor-conductor" Mar 10 14:25:47 crc kubenswrapper[4911]: E0310 14:25:47.414577 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:47 crc kubenswrapper[4911]: E0310 14:25:47.416558 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:47 crc kubenswrapper[4911]: E0310 14:25:47.418008 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:47 crc kubenswrapper[4911]: E0310 14:25:47.418060 4911 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" containerName="nova-cell0-conductor-conductor" Mar 10 14:25:52 crc kubenswrapper[4911]: E0310 14:25:52.414195 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:52 crc kubenswrapper[4911]: E0310 14:25:52.416810 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:52 crc kubenswrapper[4911]: E0310 14:25:52.418859 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:52 crc kubenswrapper[4911]: E0310 14:25:52.418948 4911 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" containerName="nova-cell0-conductor-conductor" Mar 10 14:25:57 crc kubenswrapper[4911]: E0310 14:25:57.415192 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:57 crc kubenswrapper[4911]: E0310 14:25:57.418050 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:57 crc kubenswrapper[4911]: E0310 14:25:57.419995 4911 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 14:25:57 crc kubenswrapper[4911]: E0310 14:25:57.420134 4911 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" containerName="nova-cell0-conductor-conductor" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.155355 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552546-rw8md"] Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.157328 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552546-rw8md" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.163280 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.164838 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.165920 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552546-rw8md"] Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.170488 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.265140 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7f98\" (UniqueName: \"kubernetes.io/projected/a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e-kube-api-access-n7f98\") pod \"auto-csr-approver-29552546-rw8md\" (UID: \"a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e\") " pod="openshift-infra/auto-csr-approver-29552546-rw8md" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 
14:26:00.320644 4911 generic.go:334] "Generic (PLEG): container finished" podID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" exitCode=137 Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.320783 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc","Type":"ContainerDied","Data":"397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3"} Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.321396 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc","Type":"ContainerDied","Data":"9da7d23f9ee4bee12c522054ab4672c0ff0861aa72ffe29d0a18b42dd5bf149c"} Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.321420 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9da7d23f9ee4bee12c522054ab4672c0ff0861aa72ffe29d0a18b42dd5bf149c" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.367128 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7f98\" (UniqueName: \"kubernetes.io/projected/a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e-kube-api-access-n7f98\") pod \"auto-csr-approver-29552546-rw8md\" (UID: \"a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e\") " pod="openshift-infra/auto-csr-approver-29552546-rw8md" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.384906 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.389298 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7f98\" (UniqueName: \"kubernetes.io/projected/a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e-kube-api-access-n7f98\") pod \"auto-csr-approver-29552546-rw8md\" (UID: \"a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e\") " pod="openshift-infra/auto-csr-approver-29552546-rw8md" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.502540 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552546-rw8md" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.571512 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-combined-ca-bundle\") pod \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.571574 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtws5\" (UniqueName: \"kubernetes.io/projected/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-kube-api-access-qtws5\") pod \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.571611 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-config-data\") pod \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\" (UID: \"db3a0241-0c0a-43c8-b76f-4c7a91c4edcc\") " Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.575607 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-kube-api-access-qtws5" (OuterVolumeSpecName: 
"kube-api-access-qtws5") pod "db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" (UID: "db3a0241-0c0a-43c8-b76f-4c7a91c4edcc"). InnerVolumeSpecName "kube-api-access-qtws5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.600643 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" (UID: "db3a0241-0c0a-43c8-b76f-4c7a91c4edcc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.609325 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-config-data" (OuterVolumeSpecName: "config-data") pod "db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" (UID: "db3a0241-0c0a-43c8-b76f-4c7a91c4edcc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.674322 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.674603 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtws5\" (UniqueName: \"kubernetes.io/projected/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-kube-api-access-qtws5\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.674622 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:00 crc kubenswrapper[4911]: I0310 14:26:00.978083 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552546-rw8md"] Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.330828 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552546-rw8md" event={"ID":"a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e","Type":"ContainerStarted","Data":"386048696f14b561707ae9250dfe9a35ec0b4174c655e2663432107fc3195f30"} Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.330895 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.364312 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.403133 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.416450 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 14:26:01 crc kubenswrapper[4911]: E0310 14:26:01.416896 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" containerName="nova-cell0-conductor-conductor" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.416925 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" containerName="nova-cell0-conductor-conductor" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.417192 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" containerName="nova-cell0-conductor-conductor" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.418016 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.422105 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-69dl5" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.422194 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.445712 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.612637 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0163a2-67bb-4bff-b4c7-c525f764e808-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"af0163a2-67bb-4bff-b4c7-c525f764e808\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.612711 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af0163a2-67bb-4bff-b4c7-c525f764e808-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"af0163a2-67bb-4bff-b4c7-c525f764e808\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.612825 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pkfz\" (UniqueName: \"kubernetes.io/projected/af0163a2-67bb-4bff-b4c7-c525f764e808-kube-api-access-9pkfz\") pod \"nova-cell0-conductor-0\" (UID: \"af0163a2-67bb-4bff-b4c7-c525f764e808\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.715053 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af0163a2-67bb-4bff-b4c7-c525f764e808-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"af0163a2-67bb-4bff-b4c7-c525f764e808\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.715167 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af0163a2-67bb-4bff-b4c7-c525f764e808-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"af0163a2-67bb-4bff-b4c7-c525f764e808\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.715250 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pkfz\" (UniqueName: \"kubernetes.io/projected/af0163a2-67bb-4bff-b4c7-c525f764e808-kube-api-access-9pkfz\") pod \"nova-cell0-conductor-0\" (UID: \"af0163a2-67bb-4bff-b4c7-c525f764e808\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.728831 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0163a2-67bb-4bff-b4c7-c525f764e808-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"af0163a2-67bb-4bff-b4c7-c525f764e808\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.730407 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af0163a2-67bb-4bff-b4c7-c525f764e808-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"af0163a2-67bb-4bff-b4c7-c525f764e808\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.734551 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pkfz\" (UniqueName: \"kubernetes.io/projected/af0163a2-67bb-4bff-b4c7-c525f764e808-kube-api-access-9pkfz\") pod \"nova-cell0-conductor-0\" (UID: 
\"af0163a2-67bb-4bff-b4c7-c525f764e808\") " pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:01 crc kubenswrapper[4911]: I0310 14:26:01.744501 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:02 crc kubenswrapper[4911]: I0310 14:26:02.216889 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db3a0241-0c0a-43c8-b76f-4c7a91c4edcc" path="/var/lib/kubelet/pods/db3a0241-0c0a-43c8-b76f-4c7a91c4edcc/volumes" Mar 10 14:26:02 crc kubenswrapper[4911]: I0310 14:26:02.236402 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 14:26:02 crc kubenswrapper[4911]: W0310 14:26:02.259702 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf0163a2_67bb_4bff_b4c7_c525f764e808.slice/crio-011859c492ad83146aed1a17dff3f5b67aa8bc8db9ac60a50f1be37f270ddb39 WatchSource:0}: Error finding container 011859c492ad83146aed1a17dff3f5b67aa8bc8db9ac60a50f1be37f270ddb39: Status 404 returned error can't find the container with id 011859c492ad83146aed1a17dff3f5b67aa8bc8db9ac60a50f1be37f270ddb39 Mar 10 14:26:02 crc kubenswrapper[4911]: I0310 14:26:02.348817 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552546-rw8md" event={"ID":"a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e","Type":"ContainerStarted","Data":"e1c86c8fa4589719e51bcea2266d9340cdfb3e3d41126c995121630935c5337e"} Mar 10 14:26:02 crc kubenswrapper[4911]: I0310 14:26:02.355050 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"af0163a2-67bb-4bff-b4c7-c525f764e808","Type":"ContainerStarted","Data":"011859c492ad83146aed1a17dff3f5b67aa8bc8db9ac60a50f1be37f270ddb39"} Mar 10 14:26:02 crc kubenswrapper[4911]: I0310 14:26:02.369807 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29552546-rw8md" podStartSLOduration=1.34904449 podStartE2EDuration="2.369781494s" podCreationTimestamp="2026-03-10 14:26:00 +0000 UTC" firstStartedPulling="2026-03-10 14:26:00.984022239 +0000 UTC m=+1465.547542156" lastFinishedPulling="2026-03-10 14:26:02.004759243 +0000 UTC m=+1466.568279160" observedRunningTime="2026-03-10 14:26:02.368261594 +0000 UTC m=+1466.931781511" watchObservedRunningTime="2026-03-10 14:26:02.369781494 +0000 UTC m=+1466.933301411" Mar 10 14:26:02 crc kubenswrapper[4911]: I0310 14:26:02.430543 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 14:26:03 crc kubenswrapper[4911]: I0310 14:26:03.367393 4911 generic.go:334] "Generic (PLEG): container finished" podID="a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e" containerID="e1c86c8fa4589719e51bcea2266d9340cdfb3e3d41126c995121630935c5337e" exitCode=0 Mar 10 14:26:03 crc kubenswrapper[4911]: I0310 14:26:03.367452 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552546-rw8md" event={"ID":"a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e","Type":"ContainerDied","Data":"e1c86c8fa4589719e51bcea2266d9340cdfb3e3d41126c995121630935c5337e"} Mar 10 14:26:03 crc kubenswrapper[4911]: I0310 14:26:03.370704 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"af0163a2-67bb-4bff-b4c7-c525f764e808","Type":"ContainerStarted","Data":"b43151a01be27c308ff524da9a8cfd3ace3bc7aebf06a05a61efc0d75c373ccd"} Mar 10 14:26:03 crc kubenswrapper[4911]: I0310 14:26:03.370936 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 14:26:03 crc kubenswrapper[4911]: I0310 14:26:03.414831 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.41480667 podStartE2EDuration="2.41480667s" podCreationTimestamp="2026-03-10 
14:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:03.404947519 +0000 UTC m=+1467.968467436" watchObservedRunningTime="2026-03-10 14:26:03.41480667 +0000 UTC m=+1467.978326587" Mar 10 14:26:04 crc kubenswrapper[4911]: I0310 14:26:04.882190 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552546-rw8md" Mar 10 14:26:04 crc kubenswrapper[4911]: I0310 14:26:04.982964 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7f98\" (UniqueName: \"kubernetes.io/projected/a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e-kube-api-access-n7f98\") pod \"a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e\" (UID: \"a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e\") " Mar 10 14:26:04 crc kubenswrapper[4911]: I0310 14:26:04.990197 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e-kube-api-access-n7f98" (OuterVolumeSpecName: "kube-api-access-n7f98") pod "a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e" (UID: "a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e"). InnerVolumeSpecName "kube-api-access-n7f98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:05 crc kubenswrapper[4911]: I0310 14:26:05.085545 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7f98\" (UniqueName: \"kubernetes.io/projected/a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e-kube-api-access-n7f98\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:05 crc kubenswrapper[4911]: I0310 14:26:05.426130 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552546-rw8md" event={"ID":"a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e","Type":"ContainerDied","Data":"386048696f14b561707ae9250dfe9a35ec0b4174c655e2663432107fc3195f30"} Mar 10 14:26:05 crc kubenswrapper[4911]: I0310 14:26:05.426177 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552546-rw8md" Mar 10 14:26:05 crc kubenswrapper[4911]: I0310 14:26:05.426195 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="386048696f14b561707ae9250dfe9a35ec0b4174c655e2663432107fc3195f30" Mar 10 14:26:05 crc kubenswrapper[4911]: I0310 14:26:05.459818 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552540-bbrwg"] Mar 10 14:26:05 crc kubenswrapper[4911]: I0310 14:26:05.474163 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552540-bbrwg"] Mar 10 14:26:06 crc kubenswrapper[4911]: I0310 14:26:06.207387 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="708857de-1db1-4764-8041-2bd173460cea" path="/var/lib/kubelet/pods/708857de-1db1-4764-8041-2bd173460cea/volumes" Mar 10 14:26:06 crc kubenswrapper[4911]: I0310 14:26:06.465006 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 14:26:06 crc kubenswrapper[4911]: I0310 14:26:06.465347 4911 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/kube-state-metrics-0" podUID="88c561c2-ba90-4331-8dc1-3098939b3b3c" containerName="kube-state-metrics" containerID="cri-o://e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0" gracePeriod=30 Mar 10 14:26:06 crc kubenswrapper[4911]: I0310 14:26:06.632133 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="88c561c2-ba90-4331-8dc1-3098939b3b3c" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": dial tcp 10.217.0.106:8081: connect: connection refused" Mar 10 14:26:06 crc kubenswrapper[4911]: I0310 14:26:06.916376 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.030599 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkvsw\" (UniqueName: \"kubernetes.io/projected/88c561c2-ba90-4331-8dc1-3098939b3b3c-kube-api-access-kkvsw\") pod \"88c561c2-ba90-4331-8dc1-3098939b3b3c\" (UID: \"88c561c2-ba90-4331-8dc1-3098939b3b3c\") " Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.037678 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c561c2-ba90-4331-8dc1-3098939b3b3c-kube-api-access-kkvsw" (OuterVolumeSpecName: "kube-api-access-kkvsw") pod "88c561c2-ba90-4331-8dc1-3098939b3b3c" (UID: "88c561c2-ba90-4331-8dc1-3098939b3b3c"). InnerVolumeSpecName "kube-api-access-kkvsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.135197 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkvsw\" (UniqueName: \"kubernetes.io/projected/88c561c2-ba90-4331-8dc1-3098939b3b3c-kube-api-access-kkvsw\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.450293 4911 generic.go:334] "Generic (PLEG): container finished" podID="88c561c2-ba90-4331-8dc1-3098939b3b3c" containerID="e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0" exitCode=2 Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.450343 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"88c561c2-ba90-4331-8dc1-3098939b3b3c","Type":"ContainerDied","Data":"e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0"} Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.450377 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"88c561c2-ba90-4331-8dc1-3098939b3b3c","Type":"ContainerDied","Data":"ab53cbb0a833a25a3edf807934855367260b314422a17469ab0934b7247e5928"} Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.450397 4911 scope.go:117] "RemoveContainer" containerID="e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.450520 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.494208 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.494709 4911 scope.go:117] "RemoveContainer" containerID="e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0" Mar 10 14:26:07 crc kubenswrapper[4911]: E0310 14:26:07.495304 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0\": container with ID starting with e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0 not found: ID does not exist" containerID="e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.495351 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0"} err="failed to get container status \"e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0\": rpc error: code = NotFound desc = could not find container \"e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0\": container with ID starting with e808f64397ff89e58cc9fbbce80c89777d0f3b7b2a6b9d0a5f6b81b82f6be9a0 not found: ID does not exist" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.514148 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.523274 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 14:26:07 crc kubenswrapper[4911]: E0310 14:26:07.524045 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c561c2-ba90-4331-8dc1-3098939b3b3c" containerName="kube-state-metrics" Mar 10 14:26:07 crc 
kubenswrapper[4911]: I0310 14:26:07.524075 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c561c2-ba90-4331-8dc1-3098939b3b3c" containerName="kube-state-metrics" Mar 10 14:26:07 crc kubenswrapper[4911]: E0310 14:26:07.524101 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e" containerName="oc" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.524109 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e" containerName="oc" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.524320 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e" containerName="oc" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.524357 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c561c2-ba90-4331-8dc1-3098939b3b3c" containerName="kube-state-metrics" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.525216 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.528882 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.528910 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.532666 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.647573 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd645aa4-53be-4ede-a00b-e294626fc333-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.647711 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd645aa4-53be-4ede-a00b-e294626fc333-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.647778 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cd645aa4-53be-4ede-a00b-e294626fc333-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.647810 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whf88\" (UniqueName: 
\"kubernetes.io/projected/cd645aa4-53be-4ede-a00b-e294626fc333-kube-api-access-whf88\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.750404 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd645aa4-53be-4ede-a00b-e294626fc333-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.750543 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cd645aa4-53be-4ede-a00b-e294626fc333-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.750605 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whf88\" (UniqueName: \"kubernetes.io/projected/cd645aa4-53be-4ede-a00b-e294626fc333-kube-api-access-whf88\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.750755 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd645aa4-53be-4ede-a00b-e294626fc333-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.765662 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cd645aa4-53be-4ede-a00b-e294626fc333-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.766359 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cd645aa4-53be-4ede-a00b-e294626fc333-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.776547 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd645aa4-53be-4ede-a00b-e294626fc333-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.783559 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whf88\" (UniqueName: \"kubernetes.io/projected/cd645aa4-53be-4ede-a00b-e294626fc333-kube-api-access-whf88\") pod \"kube-state-metrics-0\" (UID: \"cd645aa4-53be-4ede-a00b-e294626fc333\") " pod="openstack/kube-state-metrics-0" Mar 10 14:26:07 crc kubenswrapper[4911]: I0310 14:26:07.851987 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 14:26:08 crc kubenswrapper[4911]: I0310 14:26:08.204394 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c561c2-ba90-4331-8dc1-3098939b3b3c" path="/var/lib/kubelet/pods/88c561c2-ba90-4331-8dc1-3098939b3b3c/volumes" Mar 10 14:26:08 crc kubenswrapper[4911]: I0310 14:26:08.464947 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 14:26:08 crc kubenswrapper[4911]: I0310 14:26:08.627424 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:26:08 crc kubenswrapper[4911]: I0310 14:26:08.628324 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="ceilometer-central-agent" containerID="cri-o://97607404fb6fa193b39935df1b99dba949d260c588edf94d288757513a40b1f3" gracePeriod=30 Mar 10 14:26:08 crc kubenswrapper[4911]: I0310 14:26:08.628403 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="sg-core" containerID="cri-o://c8e7a328b8fe1b1c77334bc0dc99f9baabb5321c23a258c9ab2202910d51db76" gracePeriod=30 Mar 10 14:26:08 crc kubenswrapper[4911]: I0310 14:26:08.628442 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="ceilometer-notification-agent" containerID="cri-o://d2b9bfc16bf3c24d6ab95cc5111f2866b68518587ee5b4309371c9083df253e5" gracePeriod=30 Mar 10 14:26:08 crc kubenswrapper[4911]: I0310 14:26:08.628449 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="proxy-httpd" containerID="cri-o://c5f4bd0e4b31d2e2feeb9bb46bd491108d8930c4da09276dacc07aeca5be0c6e" 
gracePeriod=30 Mar 10 14:26:09 crc kubenswrapper[4911]: I0310 14:26:09.475975 4911 generic.go:334] "Generic (PLEG): container finished" podID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerID="c5f4bd0e4b31d2e2feeb9bb46bd491108d8930c4da09276dacc07aeca5be0c6e" exitCode=0 Mar 10 14:26:09 crc kubenswrapper[4911]: I0310 14:26:09.476346 4911 generic.go:334] "Generic (PLEG): container finished" podID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerID="c8e7a328b8fe1b1c77334bc0dc99f9baabb5321c23a258c9ab2202910d51db76" exitCode=2 Mar 10 14:26:09 crc kubenswrapper[4911]: I0310 14:26:09.476369 4911 generic.go:334] "Generic (PLEG): container finished" podID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerID="97607404fb6fa193b39935df1b99dba949d260c588edf94d288757513a40b1f3" exitCode=0 Mar 10 14:26:09 crc kubenswrapper[4911]: I0310 14:26:09.476059 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d56827f-8d2b-4f1e-a623-36b865f282d6","Type":"ContainerDied","Data":"c5f4bd0e4b31d2e2feeb9bb46bd491108d8930c4da09276dacc07aeca5be0c6e"} Mar 10 14:26:09 crc kubenswrapper[4911]: I0310 14:26:09.476536 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d56827f-8d2b-4f1e-a623-36b865f282d6","Type":"ContainerDied","Data":"c8e7a328b8fe1b1c77334bc0dc99f9baabb5321c23a258c9ab2202910d51db76"} Mar 10 14:26:09 crc kubenswrapper[4911]: I0310 14:26:09.476576 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d56827f-8d2b-4f1e-a623-36b865f282d6","Type":"ContainerDied","Data":"97607404fb6fa193b39935df1b99dba949d260c588edf94d288757513a40b1f3"} Mar 10 14:26:09 crc kubenswrapper[4911]: I0310 14:26:09.481707 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cd645aa4-53be-4ede-a00b-e294626fc333","Type":"ContainerStarted","Data":"b946d7644704db0af835010d6e99e319d9e9d2e85feacb20a7bab64b4656488d"} Mar 10 14:26:09 crc 
kubenswrapper[4911]: I0310 14:26:09.481782 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cd645aa4-53be-4ede-a00b-e294626fc333","Type":"ContainerStarted","Data":"997e3f4cc9bfab16376228b47dfa27fab6e314eb5f34dd68277f05e26a36e13e"}
Mar 10 14:26:09 crc kubenswrapper[4911]: I0310 14:26:09.482517 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 10 14:26:09 crc kubenswrapper[4911]: I0310 14:26:09.505506 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.15502415 podStartE2EDuration="2.505482706s" podCreationTimestamp="2026-03-10 14:26:07 +0000 UTC" firstStartedPulling="2026-03-10 14:26:08.481995239 +0000 UTC m=+1473.045515156" lastFinishedPulling="2026-03-10 14:26:08.832453795 +0000 UTC m=+1473.395973712" observedRunningTime="2026-03-10 14:26:09.503035091 +0000 UTC m=+1474.066555018" watchObservedRunningTime="2026-03-10 14:26:09.505482706 +0000 UTC m=+1474.069002633"
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.509168 4911 generic.go:334] "Generic (PLEG): container finished" podID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerID="d2b9bfc16bf3c24d6ab95cc5111f2866b68518587ee5b4309371c9083df253e5" exitCode=0
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.509978 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d56827f-8d2b-4f1e-a623-36b865f282d6","Type":"ContainerDied","Data":"d2b9bfc16bf3c24d6ab95cc5111f2866b68518587ee5b4309371c9083df253e5"}
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.510025 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d56827f-8d2b-4f1e-a623-36b865f282d6","Type":"ContainerDied","Data":"f5b0acdea9ec83b2a3b22bac5dcc090aba04154edaece99b4a3b87a6212188cd"}
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.510045 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5b0acdea9ec83b2a3b22bac5dcc090aba04154edaece99b4a3b87a6212188cd"
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.571030 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.654929 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-combined-ca-bundle\") pod \"3d56827f-8d2b-4f1e-a623-36b865f282d6\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") "
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.655107 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-config-data\") pod \"3d56827f-8d2b-4f1e-a623-36b865f282d6\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") "
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.655157 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-run-httpd\") pod \"3d56827f-8d2b-4f1e-a623-36b865f282d6\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") "
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.655213 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-sg-core-conf-yaml\") pod \"3d56827f-8d2b-4f1e-a623-36b865f282d6\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") "
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.655305 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w575w\" (UniqueName: \"kubernetes.io/projected/3d56827f-8d2b-4f1e-a623-36b865f282d6-kube-api-access-w575w\") pod \"3d56827f-8d2b-4f1e-a623-36b865f282d6\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") "
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.655360 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-scripts\") pod \"3d56827f-8d2b-4f1e-a623-36b865f282d6\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") "
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.655388 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-log-httpd\") pod \"3d56827f-8d2b-4f1e-a623-36b865f282d6\" (UID: \"3d56827f-8d2b-4f1e-a623-36b865f282d6\") "
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.656214 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3d56827f-8d2b-4f1e-a623-36b865f282d6" (UID: "3d56827f-8d2b-4f1e-a623-36b865f282d6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.656608 4911 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.657482 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3d56827f-8d2b-4f1e-a623-36b865f282d6" (UID: "3d56827f-8d2b-4f1e-a623-36b865f282d6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.662800 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-scripts" (OuterVolumeSpecName: "scripts") pod "3d56827f-8d2b-4f1e-a623-36b865f282d6" (UID: "3d56827f-8d2b-4f1e-a623-36b865f282d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.667335 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d56827f-8d2b-4f1e-a623-36b865f282d6-kube-api-access-w575w" (OuterVolumeSpecName: "kube-api-access-w575w") pod "3d56827f-8d2b-4f1e-a623-36b865f282d6" (UID: "3d56827f-8d2b-4f1e-a623-36b865f282d6"). InnerVolumeSpecName "kube-api-access-w575w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.726311 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3d56827f-8d2b-4f1e-a623-36b865f282d6" (UID: "3d56827f-8d2b-4f1e-a623-36b865f282d6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.759578 4911 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d56827f-8d2b-4f1e-a623-36b865f282d6-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.760098 4911 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.760114 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w575w\" (UniqueName: \"kubernetes.io/projected/3d56827f-8d2b-4f1e-a623-36b865f282d6-kube-api-access-w575w\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.760128 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.774200 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d56827f-8d2b-4f1e-a623-36b865f282d6" (UID: "3d56827f-8d2b-4f1e-a623-36b865f282d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.786449 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-config-data" (OuterVolumeSpecName: "config-data") pod "3d56827f-8d2b-4f1e-a623-36b865f282d6" (UID: "3d56827f-8d2b-4f1e-a623-36b865f282d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.791514 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.862444 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:11 crc kubenswrapper[4911]: I0310 14:26:11.862489 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d56827f-8d2b-4f1e-a623-36b865f282d6-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.365508 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-cpd4g"]
Mar 10 14:26:12 crc kubenswrapper[4911]: E0310 14:26:12.366093 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="ceilometer-central-agent"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.366119 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="ceilometer-central-agent"
Mar 10 14:26:12 crc kubenswrapper[4911]: E0310 14:26:12.366146 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="proxy-httpd"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.366155 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="proxy-httpd"
Mar 10 14:26:12 crc kubenswrapper[4911]: E0310 14:26:12.366185 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="ceilometer-notification-agent"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.366193 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="ceilometer-notification-agent"
Mar 10 14:26:12 crc kubenswrapper[4911]: E0310 14:26:12.366216 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="sg-core"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.366225 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="sg-core"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.366472 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="sg-core"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.366496 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="ceilometer-notification-agent"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.366525 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="proxy-httpd"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.366540 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" containerName="ceilometer-central-agent"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.367444 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.372354 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.376690 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.395888 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cpd4g"]
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.477482 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.477567 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-scripts\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.477672 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-427wq\" (UniqueName: \"kubernetes.io/projected/b6554330-1024-4d85-8b5b-f7f354c9631d-kube-api-access-427wq\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.477698 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-config-data\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.524262 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.574795 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.580157 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-427wq\" (UniqueName: \"kubernetes.io/projected/b6554330-1024-4d85-8b5b-f7f354c9631d-kube-api-access-427wq\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.580228 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-config-data\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.580285 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.580318 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-scripts\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.591017 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-config-data\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.592136 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.595493 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.603161 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-scripts\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.621696 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-427wq\" (UniqueName: \"kubernetes.io/projected/b6554330-1024-4d85-8b5b-f7f354c9631d-kube-api-access-427wq\") pod \"nova-cell0-cell-mapping-cpd4g\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.628064 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.630662 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.637317 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.637553 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.641169 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.647069 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.657325 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.672516 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.685897 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.689095 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cpd4g"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.703500 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.727560 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.706193 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.738155 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.752461 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787121 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787170 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ee0d96-e3d8-49ee-beae-456329344c5f-logs\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787202 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8144e45e-0b01-4199-ae1f-560e101f7eab-logs\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787226 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787249 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787285 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-log-httpd\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787305 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfjn4\" (UniqueName: \"kubernetes.io/projected/8144e45e-0b01-4199-ae1f-560e101f7eab-kube-api-access-nfjn4\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787321 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-run-httpd\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787347 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-config-data\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787370 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr7xj\" (UniqueName: \"kubernetes.io/projected/e4ee0d96-e3d8-49ee-beae-456329344c5f-kube-api-access-sr7xj\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787392 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-config-data\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787413 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787455 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwpql\" (UniqueName: \"kubernetes.io/projected/067898a3-878b-4cf6-8780-2e6692f31765-kube-api-access-nwpql\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787480 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-scripts\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787497 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-config-data\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.787521 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.854808 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-cc8d6"]
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.856786 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.890218 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.892863 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwpql\" (UniqueName: \"kubernetes.io/projected/067898a3-878b-4cf6-8780-2e6692f31765-kube-api-access-nwpql\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.892957 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-scripts\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.892999 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-config-data\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893055 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893248 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893330 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ee0d96-e3d8-49ee-beae-456329344c5f-logs\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893415 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8144e45e-0b01-4199-ae1f-560e101f7eab-logs\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893440 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893495 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893562 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-log-httpd\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893609 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfjn4\" (UniqueName: \"kubernetes.io/projected/8144e45e-0b01-4199-ae1f-560e101f7eab-kube-api-access-nfjn4\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893635 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-run-httpd\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893715 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-config-data\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893766 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr7xj\" (UniqueName: \"kubernetes.io/projected/e4ee0d96-e3d8-49ee-beae-456329344c5f-kube-api-access-sr7xj\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.893804 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-config-data\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.896037 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8144e45e-0b01-4199-ae1f-560e101f7eab-logs\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.908299 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-scripts\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.909105 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ee0d96-e3d8-49ee-beae-456329344c5f-logs\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.915071 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.927333 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-log-httpd\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.935863 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.939114 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-run-httpd\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.939614 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.940381 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.945194 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.947573 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfjn4\" (UniqueName: \"kubernetes.io/projected/8144e45e-0b01-4199-ae1f-560e101f7eab-kube-api-access-nfjn4\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.948238 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-config-data\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.948926 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr7xj\" (UniqueName: \"kubernetes.io/projected/e4ee0d96-e3d8-49ee-beae-456329344c5f-kube-api-access-sr7xj\") pod \"nova-metadata-0\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " pod="openstack/nova-metadata-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.949335 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-config-data\") pod \"nova-api-0\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") " pod="openstack/nova-api-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.950534 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-config-data\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.951610 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwpql\" (UniqueName: \"kubernetes.io/projected/067898a3-878b-4cf6-8780-2e6692f31765-kube-api-access-nwpql\") pod \"ceilometer-0\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " pod="openstack/ceilometer-0"
Mar 10 14:26:12 crc kubenswrapper[4911]: I0310 14:26:12.951668 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-cc8d6"]
Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:12.997043 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg6st\" (UniqueName: \"kubernetes.io/projected/61b95124-a31f-486a-8350-592dd5661c01-kube-api-access-cg6st\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6"
Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:12.997162 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6"
Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:12.997206 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-svc\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6"
Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:12.997236 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6"
Mar 
10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:12.997279 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:12.997328 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-config\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.017981 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.019522 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.027642 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.029624 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.030466 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.052479 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.069002 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.075272 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.079120 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.086094 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.097011 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.098444 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc6xt\" (UniqueName: \"kubernetes.io/projected/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-kube-api-access-mc6xt\") pod \"nova-cell1-novncproxy-0\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.098483 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.098542 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg6st\" (UniqueName: \"kubernetes.io/projected/61b95124-a31f-486a-8350-592dd5661c01-kube-api-access-cg6st\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: 
\"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.098576 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.098611 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.098637 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-svc\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.098662 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.098742 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " 
pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.098797 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-config\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.099899 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-config\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.100891 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.101447 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-svc\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.102064 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.106991 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.129912 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg6st\" (UniqueName: \"kubernetes.io/projected/61b95124-a31f-486a-8350-592dd5661c01-kube-api-access-cg6st\") pod \"dnsmasq-dns-757b4f8459-cc8d6\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.201600 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc6xt\" (UniqueName: \"kubernetes.io/projected/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-kube-api-access-mc6xt\") pod \"nova-cell1-novncproxy-0\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.201681 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.201783 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwngj\" (UniqueName: \"kubernetes.io/projected/56f50b77-e8db-4942-ab68-80013c8bb306-kube-api-access-nwngj\") pod \"nova-scheduler-0\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.201811 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.201865 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.202006 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-config-data\") pod \"nova-scheduler-0\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.208891 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.220437 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.228115 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc6xt\" (UniqueName: 
\"kubernetes.io/projected/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-kube-api-access-mc6xt\") pod \"nova-cell1-novncproxy-0\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.304751 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwngj\" (UniqueName: \"kubernetes.io/projected/56f50b77-e8db-4942-ab68-80013c8bb306-kube-api-access-nwngj\") pod \"nova-scheduler-0\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.304875 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.305229 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-config-data\") pod \"nova-scheduler-0\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.312514 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.327295 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-config-data\") pod \"nova-scheduler-0\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " 
pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.335063 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwngj\" (UniqueName: \"kubernetes.io/projected/56f50b77-e8db-4942-ab68-80013c8bb306-kube-api-access-nwngj\") pod \"nova-scheduler-0\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.377924 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.380627 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.422445 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.484235 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cpd4g"] Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.676461 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cpd4g" event={"ID":"b6554330-1024-4d85-8b5b-f7f354c9631d","Type":"ContainerStarted","Data":"0ee335d3fe51a04a626d5c3d7faf7b325fa721ff7b95edd3ad4b1846f5be2929"} Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.788315 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:13 crc kubenswrapper[4911]: I0310 14:26:13.943621 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.023229 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:26:14 crc kubenswrapper[4911]: W0310 14:26:14.094068 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8144e45e_0b01_4199_ae1f_560e101f7eab.slice/crio-e9166f03a87c1d3c22203f24c46034a2838abc50d0dae740ed24d70fe002b2e1 WatchSource:0}: Error finding container e9166f03a87c1d3c22203f24c46034a2838abc50d0dae740ed24d70fe002b2e1: Status 404 returned error can't find the container with id e9166f03a87c1d3c22203f24c46034a2838abc50d0dae740ed24d70fe002b2e1 Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.190168 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p89dc"] Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.193802 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.200571 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.201008 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.210419 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d56827f-8d2b-4f1e-a623-36b865f282d6" path="/var/lib/kubelet/pods/3d56827f-8d2b-4f1e-a623-36b865f282d6/volumes" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.216074 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p89dc"] Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.267408 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-config-data\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 
14:26:14.267889 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-scripts\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.269093 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j294f\" (UniqueName: \"kubernetes.io/projected/1d266d46-03f6-4f16-bf8e-fc44da521e64-kube-api-access-j294f\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.269147 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: W0310 14:26:14.331118 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba983b1b_2e11_4bc2_bb85_0e10ec390df9.slice/crio-ba912882ee742c372bc8f24288b0caf88010a944a794313bdb9c2ae67afe841a WatchSource:0}: Error finding container ba912882ee742c372bc8f24288b0caf88010a944a794313bdb9c2ae67afe841a: Status 404 returned error can't find the container with id ba912882ee742c372bc8f24288b0caf88010a944a794313bdb9c2ae67afe841a Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.361328 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.380286 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j294f\" (UniqueName: \"kubernetes.io/projected/1d266d46-03f6-4f16-bf8e-fc44da521e64-kube-api-access-j294f\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.380339 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.380383 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-config-data\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.380428 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-scripts\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.393764 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-config-data\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.394476 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.398440 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-scripts\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.403063 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j294f\" (UniqueName: \"kubernetes.io/projected/1d266d46-03f6-4f16-bf8e-fc44da521e64-kube-api-access-j294f\") pod \"nova-cell1-conductor-db-sync-p89dc\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.415170 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-cc8d6"] Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.481081 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.517559 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.706709 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067898a3-878b-4cf6-8780-2e6692f31765","Type":"ContainerStarted","Data":"e2ffa80d39f34f1f07d3be320c2aed47c8398d877094584ca12d3c8153459193"} Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.712753 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" event={"ID":"61b95124-a31f-486a-8350-592dd5661c01","Type":"ContainerStarted","Data":"8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601"} Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.712781 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" event={"ID":"61b95124-a31f-486a-8350-592dd5661c01","Type":"ContainerStarted","Data":"89a9098b330d1d36a68405faac7d2c0e836afabc05e0e13acb6eb428f961a868"} Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.734070 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cpd4g" event={"ID":"b6554330-1024-4d85-8b5b-f7f354c9631d","Type":"ContainerStarted","Data":"ad27ddd0f31c25b02ce6d5636c783c77dfb05417e3613d6a844c9ff4b6b454b6"} Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.740706 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4ee0d96-e3d8-49ee-beae-456329344c5f","Type":"ContainerStarted","Data":"cf76dd92a3bd79e26ce7b0283d695b18e8e0f2f410f640d5688f07ece418f3ff"} Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.785067 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-cpd4g" podStartSLOduration=2.7850442170000003 podStartE2EDuration="2.785044217s" podCreationTimestamp="2026-03-10 14:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:14.772961606 +0000 UTC m=+1479.336481523" watchObservedRunningTime="2026-03-10 14:26:14.785044217 +0000 UTC m=+1479.348564134" Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.799206 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8144e45e-0b01-4199-ae1f-560e101f7eab","Type":"ContainerStarted","Data":"e9166f03a87c1d3c22203f24c46034a2838abc50d0dae740ed24d70fe002b2e1"} Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.804273 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ba983b1b-2e11-4bc2-bb85-0e10ec390df9","Type":"ContainerStarted","Data":"ba912882ee742c372bc8f24288b0caf88010a944a794313bdb9c2ae67afe841a"} Mar 10 14:26:14 crc kubenswrapper[4911]: I0310 14:26:14.807284 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"56f50b77-e8db-4942-ab68-80013c8bb306","Type":"ContainerStarted","Data":"28498e101a02c0667011b7fcabf0bf9c61d99480693898207a97a94d2f9f28eb"} Mar 10 14:26:15 crc kubenswrapper[4911]: W0310 14:26:15.055959 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d266d46_03f6_4f16_bf8e_fc44da521e64.slice/crio-167029b7ad0d70aa730e1928fe63108f82d054992af20c06492675853d6e48cc WatchSource:0}: Error finding container 167029b7ad0d70aa730e1928fe63108f82d054992af20c06492675853d6e48cc: Status 404 returned error can't find the container with id 167029b7ad0d70aa730e1928fe63108f82d054992af20c06492675853d6e48cc Mar 10 14:26:15 crc kubenswrapper[4911]: I0310 14:26:15.067109 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p89dc"] Mar 10 14:26:15 crc kubenswrapper[4911]: I0310 14:26:15.823303 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-p89dc" event={"ID":"1d266d46-03f6-4f16-bf8e-fc44da521e64","Type":"ContainerStarted","Data":"d19318cfdd4d30c2259ddec6e16cededfb0e4291ee74cebe9fca767f6c2364b2"} Mar 10 14:26:15 crc kubenswrapper[4911]: I0310 14:26:15.824502 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p89dc" event={"ID":"1d266d46-03f6-4f16-bf8e-fc44da521e64","Type":"ContainerStarted","Data":"167029b7ad0d70aa730e1928fe63108f82d054992af20c06492675853d6e48cc"} Mar 10 14:26:15 crc kubenswrapper[4911]: I0310 14:26:15.828379 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067898a3-878b-4cf6-8780-2e6692f31765","Type":"ContainerStarted","Data":"b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f"} Mar 10 14:26:15 crc kubenswrapper[4911]: I0310 14:26:15.828404 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067898a3-878b-4cf6-8780-2e6692f31765","Type":"ContainerStarted","Data":"5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83"} Mar 10 14:26:15 crc kubenswrapper[4911]: I0310 14:26:15.835882 4911 generic.go:334] "Generic (PLEG): container finished" podID="61b95124-a31f-486a-8350-592dd5661c01" containerID="8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601" exitCode=0 Mar 10 14:26:15 crc kubenswrapper[4911]: I0310 14:26:15.837095 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" event={"ID":"61b95124-a31f-486a-8350-592dd5661c01","Type":"ContainerDied","Data":"8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601"} Mar 10 14:26:15 crc kubenswrapper[4911]: I0310 14:26:15.850124 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-p89dc" podStartSLOduration=1.8500956450000001 podStartE2EDuration="1.850095645s" podCreationTimestamp="2026-03-10 14:26:14 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:15.84045702 +0000 UTC m=+1480.403976957" watchObservedRunningTime="2026-03-10 14:26:15.850095645 +0000 UTC m=+1480.413615562" Mar 10 14:26:16 crc kubenswrapper[4911]: I0310 14:26:16.435857 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:16 crc kubenswrapper[4911]: I0310 14:26:16.462085 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 14:26:17 crc kubenswrapper[4911]: I0310 14:26:17.868137 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.881659 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4ee0d96-e3d8-49ee-beae-456329344c5f","Type":"ContainerStarted","Data":"11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd"} Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.882265 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4ee0d96-e3d8-49ee-beae-456329344c5f","Type":"ContainerStarted","Data":"128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90"} Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.881860 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e4ee0d96-e3d8-49ee-beae-456329344c5f" containerName="nova-metadata-metadata" containerID="cri-o://11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd" gracePeriod=30 Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.881814 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e4ee0d96-e3d8-49ee-beae-456329344c5f" containerName="nova-metadata-log" 
containerID="cri-o://128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90" gracePeriod=30 Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.884364 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8144e45e-0b01-4199-ae1f-560e101f7eab","Type":"ContainerStarted","Data":"e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b"} Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.884393 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8144e45e-0b01-4199-ae1f-560e101f7eab","Type":"ContainerStarted","Data":"17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b"} Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.913793 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ba983b1b-2e11-4bc2-bb85-0e10ec390df9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02" gracePeriod=30 Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.913692 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ba983b1b-2e11-4bc2-bb85-0e10ec390df9","Type":"ContainerStarted","Data":"24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02"} Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.917624 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"56f50b77-e8db-4942-ab68-80013c8bb306","Type":"ContainerStarted","Data":"8d8bd8f73633bc12932fc53b27db2b40cf833d23604859e2fd83bafa920fadfb"} Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.931784 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067898a3-878b-4cf6-8780-2e6692f31765","Type":"ContainerStarted","Data":"0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad"} Mar 10 14:26:19 
crc kubenswrapper[4911]: I0310 14:26:19.933163 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.303628521 podStartE2EDuration="7.933141111s" podCreationTimestamp="2026-03-10 14:26:12 +0000 UTC" firstStartedPulling="2026-03-10 14:26:13.88935259 +0000 UTC m=+1478.452872507" lastFinishedPulling="2026-03-10 14:26:18.51886518 +0000 UTC m=+1483.082385097" observedRunningTime="2026-03-10 14:26:19.920805104 +0000 UTC m=+1484.484325021" watchObservedRunningTime="2026-03-10 14:26:19.933141111 +0000 UTC m=+1484.496661028" Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.941124 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" event={"ID":"61b95124-a31f-486a-8350-592dd5661c01","Type":"ContainerStarted","Data":"c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183"} Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.941675 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.953919 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.593197472 podStartE2EDuration="7.953881221s" podCreationTimestamp="2026-03-10 14:26:12 +0000 UTC" firstStartedPulling="2026-03-10 14:26:14.098449376 +0000 UTC m=+1478.661969293" lastFinishedPulling="2026-03-10 14:26:18.459133125 +0000 UTC m=+1483.022653042" observedRunningTime="2026-03-10 14:26:19.949410033 +0000 UTC m=+1484.512929950" watchObservedRunningTime="2026-03-10 14:26:19.953881221 +0000 UTC m=+1484.517401138" Mar 10 14:26:19 crc kubenswrapper[4911]: I0310 14:26:19.979526 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.960052381 podStartE2EDuration="7.979503991s" podCreationTimestamp="2026-03-10 14:26:12 +0000 UTC" 
firstStartedPulling="2026-03-10 14:26:14.506348294 +0000 UTC m=+1479.069868211" lastFinishedPulling="2026-03-10 14:26:18.525799904 +0000 UTC m=+1483.089319821" observedRunningTime="2026-03-10 14:26:19.975649709 +0000 UTC m=+1484.539169626" watchObservedRunningTime="2026-03-10 14:26:19.979503991 +0000 UTC m=+1484.543023908" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.012738 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.821204229 podStartE2EDuration="8.012701371s" podCreationTimestamp="2026-03-10 14:26:12 +0000 UTC" firstStartedPulling="2026-03-10 14:26:14.334941098 +0000 UTC m=+1478.898461015" lastFinishedPulling="2026-03-10 14:26:18.52643822 +0000 UTC m=+1483.089958157" observedRunningTime="2026-03-10 14:26:20.011322505 +0000 UTC m=+1484.574842422" watchObservedRunningTime="2026-03-10 14:26:20.012701371 +0000 UTC m=+1484.576221288" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.055504 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" podStartSLOduration=8.055481756 podStartE2EDuration="8.055481756s" podCreationTimestamp="2026-03-10 14:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:20.053331099 +0000 UTC m=+1484.616851006" watchObservedRunningTime="2026-03-10 14:26:20.055481756 +0000 UTC m=+1484.619001673" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.815915 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.892458 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-combined-ca-bundle\") pod \"e4ee0d96-e3d8-49ee-beae-456329344c5f\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.892975 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr7xj\" (UniqueName: \"kubernetes.io/projected/e4ee0d96-e3d8-49ee-beae-456329344c5f-kube-api-access-sr7xj\") pod \"e4ee0d96-e3d8-49ee-beae-456329344c5f\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.893173 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-config-data\") pod \"e4ee0d96-e3d8-49ee-beae-456329344c5f\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.893306 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ee0d96-e3d8-49ee-beae-456329344c5f-logs\") pod \"e4ee0d96-e3d8-49ee-beae-456329344c5f\" (UID: \"e4ee0d96-e3d8-49ee-beae-456329344c5f\") " Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.897249 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4ee0d96-e3d8-49ee-beae-456329344c5f-logs" (OuterVolumeSpecName: "logs") pod "e4ee0d96-e3d8-49ee-beae-456329344c5f" (UID: "e4ee0d96-e3d8-49ee-beae-456329344c5f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.916451 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ee0d96-e3d8-49ee-beae-456329344c5f-kube-api-access-sr7xj" (OuterVolumeSpecName: "kube-api-access-sr7xj") pod "e4ee0d96-e3d8-49ee-beae-456329344c5f" (UID: "e4ee0d96-e3d8-49ee-beae-456329344c5f"). InnerVolumeSpecName "kube-api-access-sr7xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.932344 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4ee0d96-e3d8-49ee-beae-456329344c5f" (UID: "e4ee0d96-e3d8-49ee-beae-456329344c5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.962956 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-config-data" (OuterVolumeSpecName: "config-data") pod "e4ee0d96-e3d8-49ee-beae-456329344c5f" (UID: "e4ee0d96-e3d8-49ee-beae-456329344c5f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.970157 4911 generic.go:334] "Generic (PLEG): container finished" podID="e4ee0d96-e3d8-49ee-beae-456329344c5f" containerID="11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd" exitCode=0 Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.970194 4911 generic.go:334] "Generic (PLEG): container finished" podID="e4ee0d96-e3d8-49ee-beae-456329344c5f" containerID="128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90" exitCode=143 Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.970286 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.970349 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4ee0d96-e3d8-49ee-beae-456329344c5f","Type":"ContainerDied","Data":"11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd"} Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.970401 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4ee0d96-e3d8-49ee-beae-456329344c5f","Type":"ContainerDied","Data":"128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90"} Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.970413 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4ee0d96-e3d8-49ee-beae-456329344c5f","Type":"ContainerDied","Data":"cf76dd92a3bd79e26ce7b0283d695b18e8e0f2f410f640d5688f07ece418f3ff"} Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.970431 4911 scope.go:117] "RemoveContainer" containerID="11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.997253 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e4ee0d96-e3d8-49ee-beae-456329344c5f-logs\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.997318 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.997329 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr7xj\" (UniqueName: \"kubernetes.io/projected/e4ee0d96-e3d8-49ee-beae-456329344c5f-kube-api-access-sr7xj\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:20 crc kubenswrapper[4911]: I0310 14:26:20.997338 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ee0d96-e3d8-49ee-beae-456329344c5f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.113406 4911 scope.go:117] "RemoveContainer" containerID="128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.129538 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.157080 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.171711 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:21 crc kubenswrapper[4911]: E0310 14:26:21.172398 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ee0d96-e3d8-49ee-beae-456329344c5f" containerName="nova-metadata-log" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.172469 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ee0d96-e3d8-49ee-beae-456329344c5f" containerName="nova-metadata-log" Mar 10 14:26:21 crc kubenswrapper[4911]: E0310 
14:26:21.172566 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ee0d96-e3d8-49ee-beae-456329344c5f" containerName="nova-metadata-metadata" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.172638 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ee0d96-e3d8-49ee-beae-456329344c5f" containerName="nova-metadata-metadata" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.172991 4911 scope.go:117] "RemoveContainer" containerID="11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.173009 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ee0d96-e3d8-49ee-beae-456329344c5f" containerName="nova-metadata-metadata" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.173126 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ee0d96-e3d8-49ee-beae-456329344c5f" containerName="nova-metadata-log" Mar 10 14:26:21 crc kubenswrapper[4911]: E0310 14:26:21.174391 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd\": container with ID starting with 11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd not found: ID does not exist" containerID="11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.174489 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd"} err="failed to get container status \"11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd\": rpc error: code = NotFound desc = could not find container \"11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd\": container with ID starting with 11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd not found: ID 
does not exist" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.174678 4911 scope.go:117] "RemoveContainer" containerID="128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.174608 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.181301 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.181637 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.185711 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:21 crc kubenswrapper[4911]: E0310 14:26:21.189777 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90\": container with ID starting with 128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90 not found: ID does not exist" containerID="128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.189937 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90"} err="failed to get container status \"128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90\": rpc error: code = NotFound desc = could not find container \"128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90\": container with ID starting with 128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90 not found: ID does not exist" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.190042 4911 
scope.go:117] "RemoveContainer" containerID="11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.190495 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd"} err="failed to get container status \"11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd\": rpc error: code = NotFound desc = could not find container \"11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd\": container with ID starting with 11a02d10967a20e41a0c5b0bbf4da0dc9435d65faf6faef300406c67d16013fd not found: ID does not exist" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.190560 4911 scope.go:117] "RemoveContainer" containerID="128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.195516 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90"} err="failed to get container status \"128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90\": rpc error: code = NotFound desc = could not find container \"128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90\": container with ID starting with 128f2dbd2f8961826d6348ec7178a10db9eb5149252703e61abeec62277f3a90 not found: ID does not exist" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.307604 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.307718 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.307825 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-config-data\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.309040 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkl8d\" (UniqueName: \"kubernetes.io/projected/55164988-7848-4980-89d7-bd4c4637b193-kube-api-access-tkl8d\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.309275 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55164988-7848-4980-89d7-bd4c4637b193-logs\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.411328 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.411404 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.411456 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-config-data\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.411537 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkl8d\" (UniqueName: \"kubernetes.io/projected/55164988-7848-4980-89d7-bd4c4637b193-kube-api-access-tkl8d\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.411636 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55164988-7848-4980-89d7-bd4c4637b193-logs\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.412218 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55164988-7848-4980-89d7-bd4c4637b193-logs\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.419176 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc 
kubenswrapper[4911]: I0310 14:26:21.419316 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.419668 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-config-data\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.438224 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkl8d\" (UniqueName: \"kubernetes.io/projected/55164988-7848-4980-89d7-bd4c4637b193-kube-api-access-tkl8d\") pod \"nova-metadata-0\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.515957 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.982005 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55164988-7848-4980-89d7-bd4c4637b193","Type":"ContainerStarted","Data":"fb4635bea76139cde825ed879eb7c3bc39c9fe86826c2d3179beb3db44c40124"} Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.990260 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067898a3-878b-4cf6-8780-2e6692f31765","Type":"ContainerStarted","Data":"6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4"} Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.991304 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 14:26:21 crc kubenswrapper[4911]: I0310 14:26:21.991593 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:22 crc kubenswrapper[4911]: I0310 14:26:22.020491 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.188837886 podStartE2EDuration="10.020459753s" podCreationTimestamp="2026-03-10 14:26:12 +0000 UTC" firstStartedPulling="2026-03-10 14:26:14.093380351 +0000 UTC m=+1478.656900268" lastFinishedPulling="2026-03-10 14:26:20.925002218 +0000 UTC m=+1485.488522135" observedRunningTime="2026-03-10 14:26:22.013020256 +0000 UTC m=+1486.576540183" watchObservedRunningTime="2026-03-10 14:26:22.020459753 +0000 UTC m=+1486.583979680" Mar 10 14:26:22 crc kubenswrapper[4911]: I0310 14:26:22.234129 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ee0d96-e3d8-49ee-beae-456329344c5f" path="/var/lib/kubelet/pods/e4ee0d96-e3d8-49ee-beae-456329344c5f/volumes" Mar 10 14:26:23 crc kubenswrapper[4911]: I0310 14:26:23.006081 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"55164988-7848-4980-89d7-bd4c4637b193","Type":"ContainerStarted","Data":"dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45"} Mar 10 14:26:23 crc kubenswrapper[4911]: I0310 14:26:23.006171 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55164988-7848-4980-89d7-bd4c4637b193","Type":"ContainerStarted","Data":"82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e"} Mar 10 14:26:23 crc kubenswrapper[4911]: I0310 14:26:23.034199 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.034172411 podStartE2EDuration="2.034172411s" podCreationTimestamp="2026-03-10 14:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:23.024644378 +0000 UTC m=+1487.588164295" watchObservedRunningTime="2026-03-10 14:26:23.034172411 +0000 UTC m=+1487.597692328" Mar 10 14:26:23 crc kubenswrapper[4911]: I0310 14:26:23.098539 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 14:26:23 crc kubenswrapper[4911]: I0310 14:26:23.098617 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 14:26:23 crc kubenswrapper[4911]: I0310 14:26:23.379779 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:23 crc kubenswrapper[4911]: I0310 14:26:23.422739 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 14:26:23 crc kubenswrapper[4911]: I0310 14:26:23.422853 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 14:26:23 crc kubenswrapper[4911]: I0310 14:26:23.453514 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Mar 10 14:26:24 crc kubenswrapper[4911]: I0310 14:26:24.018340 4911 generic.go:334] "Generic (PLEG): container finished" podID="1d266d46-03f6-4f16-bf8e-fc44da521e64" containerID="d19318cfdd4d30c2259ddec6e16cededfb0e4291ee74cebe9fca767f6c2364b2" exitCode=0 Mar 10 14:26:24 crc kubenswrapper[4911]: I0310 14:26:24.018402 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p89dc" event={"ID":"1d266d46-03f6-4f16-bf8e-fc44da521e64","Type":"ContainerDied","Data":"d19318cfdd4d30c2259ddec6e16cededfb0e4291ee74cebe9fca767f6c2364b2"} Mar 10 14:26:24 crc kubenswrapper[4911]: I0310 14:26:24.020845 4911 generic.go:334] "Generic (PLEG): container finished" podID="b6554330-1024-4d85-8b5b-f7f354c9631d" containerID="ad27ddd0f31c25b02ce6d5636c783c77dfb05417e3613d6a844c9ff4b6b454b6" exitCode=0 Mar 10 14:26:24 crc kubenswrapper[4911]: I0310 14:26:24.021680 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cpd4g" event={"ID":"b6554330-1024-4d85-8b5b-f7f354c9631d","Type":"ContainerDied","Data":"ad27ddd0f31c25b02ce6d5636c783c77dfb05417e3613d6a844c9ff4b6b454b6"} Mar 10 14:26:24 crc kubenswrapper[4911]: I0310 14:26:24.055388 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 14:26:24 crc kubenswrapper[4911]: I0310 14:26:24.181031 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 14:26:24 crc kubenswrapper[4911]: I0310 14:26:24.181076 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.591609 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.601669 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cpd4g" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.728357 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-config-data\") pod \"1d266d46-03f6-4f16-bf8e-fc44da521e64\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.728411 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j294f\" (UniqueName: \"kubernetes.io/projected/1d266d46-03f6-4f16-bf8e-fc44da521e64-kube-api-access-j294f\") pod \"1d266d46-03f6-4f16-bf8e-fc44da521e64\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.728443 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-combined-ca-bundle\") pod \"b6554330-1024-4d85-8b5b-f7f354c9631d\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.728479 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-combined-ca-bundle\") pod \"1d266d46-03f6-4f16-bf8e-fc44da521e64\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.728558 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-config-data\") pod \"b6554330-1024-4d85-8b5b-f7f354c9631d\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.728599 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-427wq\" (UniqueName: \"kubernetes.io/projected/b6554330-1024-4d85-8b5b-f7f354c9631d-kube-api-access-427wq\") pod \"b6554330-1024-4d85-8b5b-f7f354c9631d\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.728696 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-scripts\") pod \"1d266d46-03f6-4f16-bf8e-fc44da521e64\" (UID: \"1d266d46-03f6-4f16-bf8e-fc44da521e64\") " Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.728858 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-scripts\") pod \"b6554330-1024-4d85-8b5b-f7f354c9631d\" (UID: \"b6554330-1024-4d85-8b5b-f7f354c9631d\") " Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.737509 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-scripts" (OuterVolumeSpecName: "scripts") pod "b6554330-1024-4d85-8b5b-f7f354c9631d" (UID: "b6554330-1024-4d85-8b5b-f7f354c9631d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.738843 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6554330-1024-4d85-8b5b-f7f354c9631d-kube-api-access-427wq" (OuterVolumeSpecName: "kube-api-access-427wq") pod "b6554330-1024-4d85-8b5b-f7f354c9631d" (UID: "b6554330-1024-4d85-8b5b-f7f354c9631d"). InnerVolumeSpecName "kube-api-access-427wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.739927 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-scripts" (OuterVolumeSpecName: "scripts") pod "1d266d46-03f6-4f16-bf8e-fc44da521e64" (UID: "1d266d46-03f6-4f16-bf8e-fc44da521e64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.745945 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d266d46-03f6-4f16-bf8e-fc44da521e64-kube-api-access-j294f" (OuterVolumeSpecName: "kube-api-access-j294f") pod "1d266d46-03f6-4f16-bf8e-fc44da521e64" (UID: "1d266d46-03f6-4f16-bf8e-fc44da521e64"). InnerVolumeSpecName "kube-api-access-j294f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.763305 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6554330-1024-4d85-8b5b-f7f354c9631d" (UID: "b6554330-1024-4d85-8b5b-f7f354c9631d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.767106 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d266d46-03f6-4f16-bf8e-fc44da521e64" (UID: "1d266d46-03f6-4f16-bf8e-fc44da521e64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.772899 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-config-data" (OuterVolumeSpecName: "config-data") pod "1d266d46-03f6-4f16-bf8e-fc44da521e64" (UID: "1d266d46-03f6-4f16-bf8e-fc44da521e64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.783591 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-config-data" (OuterVolumeSpecName: "config-data") pod "b6554330-1024-4d85-8b5b-f7f354c9631d" (UID: "b6554330-1024-4d85-8b5b-f7f354c9631d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.833286 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.833356 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j294f\" (UniqueName: \"kubernetes.io/projected/1d266d46-03f6-4f16-bf8e-fc44da521e64-kube-api-access-j294f\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.833373 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.833385 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.833397 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.833411 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-427wq\" (UniqueName: \"kubernetes.io/projected/b6554330-1024-4d85-8b5b-f7f354c9631d-kube-api-access-427wq\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.833438 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d266d46-03f6-4f16-bf8e-fc44da521e64-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:25 crc kubenswrapper[4911]: I0310 14:26:25.833449 4911 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6554330-1024-4d85-8b5b-f7f354c9631d-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.077634 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p89dc" event={"ID":"1d266d46-03f6-4f16-bf8e-fc44da521e64","Type":"ContainerDied","Data":"167029b7ad0d70aa730e1928fe63108f82d054992af20c06492675853d6e48cc"} Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.077999 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="167029b7ad0d70aa730e1928fe63108f82d054992af20c06492675853d6e48cc" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.078258 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p89dc" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.080647 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cpd4g" event={"ID":"b6554330-1024-4d85-8b5b-f7f354c9631d","Type":"ContainerDied","Data":"0ee335d3fe51a04a626d5c3d7faf7b325fa721ff7b95edd3ad4b1846f5be2929"} Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.080707 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ee335d3fe51a04a626d5c3d7faf7b325fa721ff7b95edd3ad4b1846f5be2929" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.080926 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cpd4g" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.216769 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 14:26:26 crc kubenswrapper[4911]: E0310 14:26:26.217231 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6554330-1024-4d85-8b5b-f7f354c9631d" containerName="nova-manage" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.217255 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6554330-1024-4d85-8b5b-f7f354c9631d" containerName="nova-manage" Mar 10 14:26:26 crc kubenswrapper[4911]: E0310 14:26:26.217284 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d266d46-03f6-4f16-bf8e-fc44da521e64" containerName="nova-cell1-conductor-db-sync" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.217293 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d266d46-03f6-4f16-bf8e-fc44da521e64" containerName="nova-cell1-conductor-db-sync" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.217543 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6554330-1024-4d85-8b5b-f7f354c9631d" containerName="nova-manage" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.217571 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d266d46-03f6-4f16-bf8e-fc44da521e64" containerName="nova-cell1-conductor-db-sync" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.224879 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.225024 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.229378 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.304744 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.305369 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerName="nova-api-log" containerID="cri-o://17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b" gracePeriod=30 Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.305629 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerName="nova-api-api" containerID="cri-o://e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b" gracePeriod=30 Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.360788 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.361088 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="56f50b77-e8db-4942-ab68-80013c8bb306" containerName="nova-scheduler-scheduler" containerID="cri-o://8d8bd8f73633bc12932fc53b27db2b40cf833d23604859e2fd83bafa920fadfb" gracePeriod=30 Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.363807 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d81b0082-b7ae-4d38-8dd2-5d20459aa493-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d81b0082-b7ae-4d38-8dd2-5d20459aa493\") " pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 
crc kubenswrapper[4911]: I0310 14:26:26.364000 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d81b0082-b7ae-4d38-8dd2-5d20459aa493-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d81b0082-b7ae-4d38-8dd2-5d20459aa493\") " pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.364054 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltkrn\" (UniqueName: \"kubernetes.io/projected/d81b0082-b7ae-4d38-8dd2-5d20459aa493-kube-api-access-ltkrn\") pod \"nova-cell1-conductor-0\" (UID: \"d81b0082-b7ae-4d38-8dd2-5d20459aa493\") " pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.375163 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.375448 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="55164988-7848-4980-89d7-bd4c4637b193" containerName="nova-metadata-log" containerID="cri-o://82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e" gracePeriod=30 Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.375588 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="55164988-7848-4980-89d7-bd4c4637b193" containerName="nova-metadata-metadata" containerID="cri-o://dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45" gracePeriod=30 Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.466688 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltkrn\" (UniqueName: \"kubernetes.io/projected/d81b0082-b7ae-4d38-8dd2-5d20459aa493-kube-api-access-ltkrn\") pod \"nova-cell1-conductor-0\" (UID: \"d81b0082-b7ae-4d38-8dd2-5d20459aa493\") " 
pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.466911 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d81b0082-b7ae-4d38-8dd2-5d20459aa493-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d81b0082-b7ae-4d38-8dd2-5d20459aa493\") " pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.467072 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d81b0082-b7ae-4d38-8dd2-5d20459aa493-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d81b0082-b7ae-4d38-8dd2-5d20459aa493\") " pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.473914 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d81b0082-b7ae-4d38-8dd2-5d20459aa493-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d81b0082-b7ae-4d38-8dd2-5d20459aa493\") " pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.474112 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d81b0082-b7ae-4d38-8dd2-5d20459aa493-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d81b0082-b7ae-4d38-8dd2-5d20459aa493\") " pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.495817 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltkrn\" (UniqueName: \"kubernetes.io/projected/d81b0082-b7ae-4d38-8dd2-5d20459aa493-kube-api-access-ltkrn\") pod \"nova-cell1-conductor-0\" (UID: \"d81b0082-b7ae-4d38-8dd2-5d20459aa493\") " pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.516519 4911 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.516569 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.544632 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:26 crc kubenswrapper[4911]: I0310 14:26:26.941470 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.083938 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55164988-7848-4980-89d7-bd4c4637b193-logs\") pod \"55164988-7848-4980-89d7-bd4c4637b193\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.084009 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-combined-ca-bundle\") pod \"55164988-7848-4980-89d7-bd4c4637b193\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.084067 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-config-data\") pod \"55164988-7848-4980-89d7-bd4c4637b193\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.084281 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkl8d\" (UniqueName: \"kubernetes.io/projected/55164988-7848-4980-89d7-bd4c4637b193-kube-api-access-tkl8d\") pod \"55164988-7848-4980-89d7-bd4c4637b193\" (UID: 
\"55164988-7848-4980-89d7-bd4c4637b193\") " Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.084351 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-nova-metadata-tls-certs\") pod \"55164988-7848-4980-89d7-bd4c4637b193\" (UID: \"55164988-7848-4980-89d7-bd4c4637b193\") " Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.086120 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55164988-7848-4980-89d7-bd4c4637b193-logs" (OuterVolumeSpecName: "logs") pod "55164988-7848-4980-89d7-bd4c4637b193" (UID: "55164988-7848-4980-89d7-bd4c4637b193"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.095060 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55164988-7848-4980-89d7-bd4c4637b193-kube-api-access-tkl8d" (OuterVolumeSpecName: "kube-api-access-tkl8d") pod "55164988-7848-4980-89d7-bd4c4637b193" (UID: "55164988-7848-4980-89d7-bd4c4637b193"). InnerVolumeSpecName "kube-api-access-tkl8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.097259 4911 generic.go:334] "Generic (PLEG): container finished" podID="55164988-7848-4980-89d7-bd4c4637b193" containerID="dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45" exitCode=0 Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.097304 4911 generic.go:334] "Generic (PLEG): container finished" podID="55164988-7848-4980-89d7-bd4c4637b193" containerID="82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e" exitCode=143 Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.097370 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55164988-7848-4980-89d7-bd4c4637b193","Type":"ContainerDied","Data":"dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45"} Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.097413 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55164988-7848-4980-89d7-bd4c4637b193","Type":"ContainerDied","Data":"82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e"} Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.097429 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55164988-7848-4980-89d7-bd4c4637b193","Type":"ContainerDied","Data":"fb4635bea76139cde825ed879eb7c3bc39c9fe86826c2d3179beb3db44c40124"} Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.097458 4911 scope.go:117] "RemoveContainer" containerID="dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.097666 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.103596 4911 generic.go:334] "Generic (PLEG): container finished" podID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerID="17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b" exitCode=143 Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.103678 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8144e45e-0b01-4199-ae1f-560e101f7eab","Type":"ContainerDied","Data":"17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b"} Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.106073 4911 generic.go:334] "Generic (PLEG): container finished" podID="56f50b77-e8db-4942-ab68-80013c8bb306" containerID="8d8bd8f73633bc12932fc53b27db2b40cf833d23604859e2fd83bafa920fadfb" exitCode=0 Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.106143 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"56f50b77-e8db-4942-ab68-80013c8bb306","Type":"ContainerDied","Data":"8d8bd8f73633bc12932fc53b27db2b40cf833d23604859e2fd83bafa920fadfb"} Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.131930 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-config-data" (OuterVolumeSpecName: "config-data") pod "55164988-7848-4980-89d7-bd4c4637b193" (UID: "55164988-7848-4980-89d7-bd4c4637b193"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.133268 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55164988-7848-4980-89d7-bd4c4637b193" (UID: "55164988-7848-4980-89d7-bd4c4637b193"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.148349 4911 scope.go:117] "RemoveContainer" containerID="82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.152645 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.158034 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "55164988-7848-4980-89d7-bd4c4637b193" (UID: "55164988-7848-4980-89d7-bd4c4637b193"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:27 crc kubenswrapper[4911]: W0310 14:26:27.159878 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd81b0082_b7ae_4d38_8dd2_5d20459aa493.slice/crio-5643fb01622831b3f113adb0c40f974bb437bdc6818fe9f468f65c13a107f49d WatchSource:0}: Error finding container 5643fb01622831b3f113adb0c40f974bb437bdc6818fe9f468f65c13a107f49d: Status 404 returned error can't find the container with id 5643fb01622831b3f113adb0c40f974bb437bdc6818fe9f468f65c13a107f49d Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.178854 4911 scope.go:117] "RemoveContainer" containerID="dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45" Mar 10 14:26:27 crc kubenswrapper[4911]: E0310 14:26:27.181818 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45\": container with ID starting with dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45 not found: ID does not exist" 
containerID="dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.182142 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45"} err="failed to get container status \"dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45\": rpc error: code = NotFound desc = could not find container \"dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45\": container with ID starting with dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45 not found: ID does not exist" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.182167 4911 scope.go:117] "RemoveContainer" containerID="82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e" Mar 10 14:26:27 crc kubenswrapper[4911]: E0310 14:26:27.182610 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e\": container with ID starting with 82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e not found: ID does not exist" containerID="82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.182658 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e"} err="failed to get container status \"82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e\": rpc error: code = NotFound desc = could not find container \"82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e\": container with ID starting with 82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e not found: ID does not exist" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.182780 4911 scope.go:117] 
"RemoveContainer" containerID="dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.183134 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45"} err="failed to get container status \"dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45\": rpc error: code = NotFound desc = could not find container \"dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45\": container with ID starting with dffdfae8141c4d989ca3c3f0f98ae900fbcd65cf74e11c238cfc71040ea7cc45 not found: ID does not exist" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.183162 4911 scope.go:117] "RemoveContainer" containerID="82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.183525 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e"} err="failed to get container status \"82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e\": rpc error: code = NotFound desc = could not find container \"82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e\": container with ID starting with 82dab156889bacebfc399ce00880d4bb31f56574d871cffad19ebf11cf7b973e not found: ID does not exist" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.186111 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkl8d\" (UniqueName: \"kubernetes.io/projected/55164988-7848-4980-89d7-bd4c4637b193-kube-api-access-tkl8d\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.186146 4911 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-nova-metadata-tls-certs\") on 
node \"crc\" DevicePath \"\"" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.186175 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55164988-7848-4980-89d7-bd4c4637b193-logs\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.186186 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.186198 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55164988-7848-4980-89d7-bd4c4637b193-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.537928 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.559149 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.570367 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.588348 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:27 crc kubenswrapper[4911]: E0310 14:26:27.588975 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55164988-7848-4980-89d7-bd4c4637b193" containerName="nova-metadata-log" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.588999 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="55164988-7848-4980-89d7-bd4c4637b193" containerName="nova-metadata-log" Mar 10 14:26:27 crc kubenswrapper[4911]: E0310 14:26:27.589017 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="56f50b77-e8db-4942-ab68-80013c8bb306" containerName="nova-scheduler-scheduler" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.589025 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f50b77-e8db-4942-ab68-80013c8bb306" containerName="nova-scheduler-scheduler" Mar 10 14:26:27 crc kubenswrapper[4911]: E0310 14:26:27.589065 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55164988-7848-4980-89d7-bd4c4637b193" containerName="nova-metadata-metadata" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.589072 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="55164988-7848-4980-89d7-bd4c4637b193" containerName="nova-metadata-metadata" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.589283 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f50b77-e8db-4942-ab68-80013c8bb306" containerName="nova-scheduler-scheduler" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.589352 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="55164988-7848-4980-89d7-bd4c4637b193" containerName="nova-metadata-metadata" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.589366 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="55164988-7848-4980-89d7-bd4c4637b193" containerName="nova-metadata-log" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.590587 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.594292 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-config-data\") pod \"56f50b77-e8db-4942-ab68-80013c8bb306\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.594475 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-combined-ca-bundle\") pod \"56f50b77-e8db-4942-ab68-80013c8bb306\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.594699 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwngj\" (UniqueName: \"kubernetes.io/projected/56f50b77-e8db-4942-ab68-80013c8bb306-kube-api-access-nwngj\") pod \"56f50b77-e8db-4942-ab68-80013c8bb306\" (UID: \"56f50b77-e8db-4942-ab68-80013c8bb306\") " Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.599783 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.600034 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.605873 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f50b77-e8db-4942-ab68-80013c8bb306-kube-api-access-nwngj" (OuterVolumeSpecName: "kube-api-access-nwngj") pod "56f50b77-e8db-4942-ab68-80013c8bb306" (UID: "56f50b77-e8db-4942-ab68-80013c8bb306"). InnerVolumeSpecName "kube-api-access-nwngj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.616220 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.624886 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56f50b77-e8db-4942-ab68-80013c8bb306" (UID: "56f50b77-e8db-4942-ab68-80013c8bb306"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.642755 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-config-data" (OuterVolumeSpecName: "config-data") pod "56f50b77-e8db-4942-ab68-80013c8bb306" (UID: "56f50b77-e8db-4942-ab68-80013c8bb306"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.697577 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.697691 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.697784 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-config-data\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.697851 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qxk7\" (UniqueName: \"kubernetes.io/projected/c0507925-e9be-40d0-96ad-2a99166d3674-kube-api-access-8qxk7\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.697880 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0507925-e9be-40d0-96ad-2a99166d3674-logs\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc 
kubenswrapper[4911]: I0310 14:26:27.697992 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.698014 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f50b77-e8db-4942-ab68-80013c8bb306-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.698028 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwngj\" (UniqueName: \"kubernetes.io/projected/56f50b77-e8db-4942-ab68-80013c8bb306-kube-api-access-nwngj\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.800359 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.800463 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-config-data\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.800526 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qxk7\" (UniqueName: \"kubernetes.io/projected/c0507925-e9be-40d0-96ad-2a99166d3674-kube-api-access-8qxk7\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.800553 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0507925-e9be-40d0-96ad-2a99166d3674-logs\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.800617 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.801342 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0507925-e9be-40d0-96ad-2a99166d3674-logs\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.807215 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.807850 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-config-data\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.808617 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:27 crc kubenswrapper[4911]: I0310 14:26:27.822530 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qxk7\" (UniqueName: \"kubernetes.io/projected/c0507925-e9be-40d0-96ad-2a99166d3674-kube-api-access-8qxk7\") pod \"nova-metadata-0\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") " pod="openstack/nova-metadata-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.010718 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.176511 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d81b0082-b7ae-4d38-8dd2-5d20459aa493","Type":"ContainerStarted","Data":"2d2d1b616274407249b397f8cc00ac3542e3a7c8c84d0e4497a3cffb13684dc0"} Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.176592 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d81b0082-b7ae-4d38-8dd2-5d20459aa493","Type":"ContainerStarted","Data":"5643fb01622831b3f113adb0c40f974bb437bdc6818fe9f468f65c13a107f49d"} Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.179113 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.181022 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"56f50b77-e8db-4942-ab68-80013c8bb306","Type":"ContainerDied","Data":"28498e101a02c0667011b7fcabf0bf9c61d99480693898207a97a94d2f9f28eb"} Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.181068 4911 scope.go:117] "RemoveContainer" containerID="8d8bd8f73633bc12932fc53b27db2b40cf833d23604859e2fd83bafa920fadfb" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.181194 4911 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.221606 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55164988-7848-4980-89d7-bd4c4637b193" path="/var/lib/kubelet/pods/55164988-7848-4980-89d7-bd4c4637b193/volumes" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.233280 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.233233966 podStartE2EDuration="2.233233966s" podCreationTimestamp="2026-03-10 14:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:28.198869185 +0000 UTC m=+1492.762389102" watchObservedRunningTime="2026-03-10 14:26:28.233233966 +0000 UTC m=+1492.796753883" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.265033 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.286798 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.300861 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.318325 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.320997 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.325356 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.383888 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.420626 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.420830 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9zl9\" (UniqueName: \"kubernetes.io/projected/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-kube-api-access-c9zl9\") pod \"nova-scheduler-0\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.420921 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-config-data\") pod \"nova-scheduler-0\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.452386 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2hkbh"] Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.452760 4911 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" podUID="5322d280-17b4-489e-a867-6527ce33d3f4" containerName="dnsmasq-dns" containerID="cri-o://5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de" gracePeriod=10 Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.522710 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.523012 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9zl9\" (UniqueName: \"kubernetes.io/projected/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-kube-api-access-c9zl9\") pod \"nova-scheduler-0\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.523087 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-config-data\") pod \"nova-scheduler-0\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.531694 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-config-data\") pod \"nova-scheduler-0\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.534371 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.547019 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9zl9\" (UniqueName: \"kubernetes.io/projected/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-kube-api-access-c9zl9\") pod \"nova-scheduler-0\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") " pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.550912 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 14:26:28 crc kubenswrapper[4911]: W0310 14:26:28.567402 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0507925_e9be_40d0_96ad_2a99166d3674.slice/crio-6977bf98c8350ee451e34cf0cdb224cfccb6f95280995302e69396f65f4addd8 WatchSource:0}: Error finding container 6977bf98c8350ee451e34cf0cdb224cfccb6f95280995302e69396f65f4addd8: Status 404 returned error can't find the container with id 6977bf98c8350ee451e34cf0cdb224cfccb6f95280995302e69396f65f4addd8 Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.656693 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 14:26:28 crc kubenswrapper[4911]: I0310 14:26:28.962511 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.035355 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-swift-storage-0\") pod \"5322d280-17b4-489e-a867-6527ce33d3f4\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.035526 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-config\") pod \"5322d280-17b4-489e-a867-6527ce33d3f4\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.035568 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-nb\") pod \"5322d280-17b4-489e-a867-6527ce33d3f4\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.035718 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4nrd\" (UniqueName: \"kubernetes.io/projected/5322d280-17b4-489e-a867-6527ce33d3f4-kube-api-access-z4nrd\") pod \"5322d280-17b4-489e-a867-6527ce33d3f4\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.035852 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-svc\") pod \"5322d280-17b4-489e-a867-6527ce33d3f4\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.035967 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-sb\") pod \"5322d280-17b4-489e-a867-6527ce33d3f4\" (UID: \"5322d280-17b4-489e-a867-6527ce33d3f4\") " Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.052510 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5322d280-17b4-489e-a867-6527ce33d3f4-kube-api-access-z4nrd" (OuterVolumeSpecName: "kube-api-access-z4nrd") pod "5322d280-17b4-489e-a867-6527ce33d3f4" (UID: "5322d280-17b4-489e-a867-6527ce33d3f4"). InnerVolumeSpecName "kube-api-access-z4nrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.115022 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5322d280-17b4-489e-a867-6527ce33d3f4" (UID: "5322d280-17b4-489e-a867-6527ce33d3f4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.138751 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4nrd\" (UniqueName: \"kubernetes.io/projected/5322d280-17b4-489e-a867-6527ce33d3f4-kube-api-access-z4nrd\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.138780 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.167991 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5322d280-17b4-489e-a867-6527ce33d3f4" (UID: "5322d280-17b4-489e-a867-6527ce33d3f4"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.175790 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5322d280-17b4-489e-a867-6527ce33d3f4" (UID: "5322d280-17b4-489e-a867-6527ce33d3f4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.205282 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.209443 4911 generic.go:334] "Generic (PLEG): container finished" podID="5322d280-17b4-489e-a867-6527ce33d3f4" containerID="5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de" exitCode=0 Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.209517 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.209578 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" event={"ID":"5322d280-17b4-489e-a867-6527ce33d3f4","Type":"ContainerDied","Data":"5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de"} Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.209642 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2hkbh" event={"ID":"5322d280-17b4-489e-a867-6527ce33d3f4","Type":"ContainerDied","Data":"32a4d954f24cac78ede323cd03a265b33fdca4880152fdfc1c8b0ec5808bc0ea"} Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.209688 4911 scope.go:117] "RemoveContainer" containerID="5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.224706 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5322d280-17b4-489e-a867-6527ce33d3f4" (UID: "5322d280-17b4-489e-a867-6527ce33d3f4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.225813 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-config" (OuterVolumeSpecName: "config") pod "5322d280-17b4-489e-a867-6527ce33d3f4" (UID: "5322d280-17b4-489e-a867-6527ce33d3f4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.226436 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0507925-e9be-40d0-96ad-2a99166d3674","Type":"ContainerStarted","Data":"1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55"} Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.226522 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0507925-e9be-40d0-96ad-2a99166d3674","Type":"ContainerStarted","Data":"6977bf98c8350ee451e34cf0cdb224cfccb6f95280995302e69396f65f4addd8"} Mar 10 14:26:29 crc kubenswrapper[4911]: W0310 14:26:29.229026 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3430ae8c_7569_4d15_88cf_84ff4d6cc01a.slice/crio-791bb90ca049dd1dc502b37c7a5f1e0563d9e76c39788230acf4fd6b65b3bff5 WatchSource:0}: Error finding container 791bb90ca049dd1dc502b37c7a5f1e0563d9e76c39788230acf4fd6b65b3bff5: Status 404 returned error can't find the container with id 791bb90ca049dd1dc502b37c7a5f1e0563d9e76c39788230acf4fd6b65b3bff5 Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.241741 4911 scope.go:117] "RemoveContainer" containerID="8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.244850 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.244937 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.244961 4911 reconciler_common.go:293] 
"Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.244981 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5322d280-17b4-489e-a867-6527ce33d3f4-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.274394 4911 scope.go:117] "RemoveContainer" containerID="5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de" Mar 10 14:26:29 crc kubenswrapper[4911]: E0310 14:26:29.282914 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de\": container with ID starting with 5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de not found: ID does not exist" containerID="5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.282964 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de"} err="failed to get container status \"5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de\": rpc error: code = NotFound desc = could not find container \"5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de\": container with ID starting with 5492feefa0ec90c9b5f25abd2e8fdc896ab8802cd35623ef20c536c506e407de not found: ID does not exist" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.282997 4911 scope.go:117] "RemoveContainer" containerID="8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c" Mar 10 14:26:29 crc kubenswrapper[4911]: E0310 14:26:29.283650 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c\": container with ID starting with 8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c not found: ID does not exist" containerID="8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.283703 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c"} err="failed to get container status \"8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c\": rpc error: code = NotFound desc = could not find container \"8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c\": container with ID starting with 8fbd76ecf66748b85aca04965039f073d8dd902d4cab64abddcd45981ee4b06c not found: ID does not exist" Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.628698 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2hkbh"] Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.662018 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2hkbh"] Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.927828 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.966029 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-combined-ca-bundle\") pod \"8144e45e-0b01-4199-ae1f-560e101f7eab\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") "
Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.966179 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8144e45e-0b01-4199-ae1f-560e101f7eab-logs\") pod \"8144e45e-0b01-4199-ae1f-560e101f7eab\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") "
Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.966287 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-config-data\") pod \"8144e45e-0b01-4199-ae1f-560e101f7eab\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") "
Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.966393 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfjn4\" (UniqueName: \"kubernetes.io/projected/8144e45e-0b01-4199-ae1f-560e101f7eab-kube-api-access-nfjn4\") pod \"8144e45e-0b01-4199-ae1f-560e101f7eab\" (UID: \"8144e45e-0b01-4199-ae1f-560e101f7eab\") "
Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.967598 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8144e45e-0b01-4199-ae1f-560e101f7eab-logs" (OuterVolumeSpecName: "logs") pod "8144e45e-0b01-4199-ae1f-560e101f7eab" (UID: "8144e45e-0b01-4199-ae1f-560e101f7eab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:26:29 crc kubenswrapper[4911]: I0310 14:26:29.983050 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8144e45e-0b01-4199-ae1f-560e101f7eab-kube-api-access-nfjn4" (OuterVolumeSpecName: "kube-api-access-nfjn4") pod "8144e45e-0b01-4199-ae1f-560e101f7eab" (UID: "8144e45e-0b01-4199-ae1f-560e101f7eab"). InnerVolumeSpecName "kube-api-access-nfjn4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.004639 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-config-data" (OuterVolumeSpecName: "config-data") pod "8144e45e-0b01-4199-ae1f-560e101f7eab" (UID: "8144e45e-0b01-4199-ae1f-560e101f7eab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.018952 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8144e45e-0b01-4199-ae1f-560e101f7eab" (UID: "8144e45e-0b01-4199-ae1f-560e101f7eab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.071581 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8144e45e-0b01-4199-ae1f-560e101f7eab-logs\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.071654 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.071673 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfjn4\" (UniqueName: \"kubernetes.io/projected/8144e45e-0b01-4199-ae1f-560e101f7eab-kube-api-access-nfjn4\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.071695 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8144e45e-0b01-4199-ae1f-560e101f7eab-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.206632 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5322d280-17b4-489e-a867-6527ce33d3f4" path="/var/lib/kubelet/pods/5322d280-17b4-489e-a867-6527ce33d3f4/volumes"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.207304 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f50b77-e8db-4942-ab68-80013c8bb306" path="/var/lib/kubelet/pods/56f50b77-e8db-4942-ab68-80013c8bb306/volumes"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.241333 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0507925-e9be-40d0-96ad-2a99166d3674","Type":"ContainerStarted","Data":"cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4"}
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.249067 4911 generic.go:334] "Generic (PLEG): container finished" podID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerID="e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b" exitCode=0
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.249136 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8144e45e-0b01-4199-ae1f-560e101f7eab","Type":"ContainerDied","Data":"e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b"}
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.249227 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8144e45e-0b01-4199-ae1f-560e101f7eab","Type":"ContainerDied","Data":"e9166f03a87c1d3c22203f24c46034a2838abc50d0dae740ed24d70fe002b2e1"}
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.249250 4911 scope.go:117] "RemoveContainer" containerID="e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.249135 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.253145 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3430ae8c-7569-4d15-88cf-84ff4d6cc01a","Type":"ContainerStarted","Data":"63f75d5fcd57d05aabd0fb36bdd533eb30af6466e4266f45b52280767613c897"}
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.253204 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3430ae8c-7569-4d15-88cf-84ff4d6cc01a","Type":"ContainerStarted","Data":"791bb90ca049dd1dc502b37c7a5f1e0563d9e76c39788230acf4fd6b65b3bff5"}
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.283364 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.283336362 podStartE2EDuration="3.283336362s" podCreationTimestamp="2026-03-10 14:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:30.2624984 +0000 UTC m=+1494.826018337" watchObservedRunningTime="2026-03-10 14:26:30.283336362 +0000 UTC m=+1494.846856289"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.288867 4911 scope.go:117] "RemoveContainer" containerID="17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.318839 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.318810183 podStartE2EDuration="2.318810183s" podCreationTimestamp="2026-03-10 14:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:30.289404223 +0000 UTC m=+1494.852924140" watchObservedRunningTime="2026-03-10 14:26:30.318810183 +0000 UTC m=+1494.882330100"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.328230 4911 scope.go:117] "RemoveContainer" containerID="e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b"
Mar 10 14:26:30 crc kubenswrapper[4911]: E0310 14:26:30.331006 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b\": container with ID starting with e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b not found: ID does not exist" containerID="e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.331250 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b"} err="failed to get container status \"e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b\": rpc error: code = NotFound desc = could not find container \"e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b\": container with ID starting with e1a641c4328929514c9daae806f18345cc80665c9cc3ea4925f95d8a4cba703b not found: ID does not exist"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.331353 4911 scope.go:117] "RemoveContainer" containerID="17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b"
Mar 10 14:26:30 crc kubenswrapper[4911]: E0310 14:26:30.334946 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b\": container with ID starting with 17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b not found: ID does not exist" containerID="17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.335019 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b"} err="failed to get container status \"17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b\": rpc error: code = NotFound desc = could not find container \"17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b\": container with ID starting with 17d42ab61d6428d67df02d094e7f796d555469018fb68f1576f7ac944b8e868b not found: ID does not exist"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.337009 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.355816 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.378297 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 10 14:26:30 crc kubenswrapper[4911]: E0310 14:26:30.379052 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5322d280-17b4-489e-a867-6527ce33d3f4" containerName="init"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.379077 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5322d280-17b4-489e-a867-6527ce33d3f4" containerName="init"
Mar 10 14:26:30 crc kubenswrapper[4911]: E0310 14:26:30.379111 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5322d280-17b4-489e-a867-6527ce33d3f4" containerName="dnsmasq-dns"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.379121 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5322d280-17b4-489e-a867-6527ce33d3f4" containerName="dnsmasq-dns"
Mar 10 14:26:30 crc kubenswrapper[4911]: E0310 14:26:30.379134 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerName="nova-api-api"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.379142 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerName="nova-api-api"
Mar 10 14:26:30 crc kubenswrapper[4911]: E0310 14:26:30.379163 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerName="nova-api-log"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.379172 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerName="nova-api-log"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.379406 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerName="nova-api-log"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.379430 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" containerName="nova-api-api"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.379457 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5322d280-17b4-489e-a867-6527ce33d3f4" containerName="dnsmasq-dns"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.381234 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.392041 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.402795 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.483251 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.483361 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzkck\" (UniqueName: \"kubernetes.io/projected/d29545ce-9c60-44b8-8c4f-54b490100dea-kube-api-access-qzkck\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.483418 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-config-data\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.483533 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d29545ce-9c60-44b8-8c4f-54b490100dea-logs\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.588115 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-config-data\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.588230 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d29545ce-9c60-44b8-8c4f-54b490100dea-logs\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.588850 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d29545ce-9c60-44b8-8c4f-54b490100dea-logs\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.588987 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.589321 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzkck\" (UniqueName: \"kubernetes.io/projected/d29545ce-9c60-44b8-8c4f-54b490100dea-kube-api-access-qzkck\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.603498 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.607136 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-config-data\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.614861 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzkck\" (UniqueName: \"kubernetes.io/projected/d29545ce-9c60-44b8-8c4f-54b490100dea-kube-api-access-qzkck\") pod \"nova-api-0\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " pod="openstack/nova-api-0"
Mar 10 14:26:30 crc kubenswrapper[4911]: I0310 14:26:30.754768 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 14:26:31 crc kubenswrapper[4911]: I0310 14:26:31.981340 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 14:26:32 crc kubenswrapper[4911]: I0310 14:26:32.225838 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8144e45e-0b01-4199-ae1f-560e101f7eab" path="/var/lib/kubelet/pods/8144e45e-0b01-4199-ae1f-560e101f7eab/volumes"
Mar 10 14:26:32 crc kubenswrapper[4911]: I0310 14:26:32.276580 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d29545ce-9c60-44b8-8c4f-54b490100dea","Type":"ContainerStarted","Data":"31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04"}
Mar 10 14:26:32 crc kubenswrapper[4911]: I0310 14:26:32.276641 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d29545ce-9c60-44b8-8c4f-54b490100dea","Type":"ContainerStarted","Data":"5e489bf303f817431fb829fe7cd7a61e7dd53335fc64c14c15bff628360a2a17"}
Mar 10 14:26:33 crc kubenswrapper[4911]: I0310 14:26:33.011056 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 14:26:33 crc kubenswrapper[4911]: I0310 14:26:33.011396 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 14:26:33 crc kubenswrapper[4911]: I0310 14:26:33.291019 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d29545ce-9c60-44b8-8c4f-54b490100dea","Type":"ContainerStarted","Data":"7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110"}
Mar 10 14:26:33 crc kubenswrapper[4911]: I0310 14:26:33.320168 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.320148038 podStartE2EDuration="3.320148038s" podCreationTimestamp="2026-03-10 14:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:33.315255008 +0000 UTC m=+1497.878774945" watchObservedRunningTime="2026-03-10 14:26:33.320148038 +0000 UTC m=+1497.883667955"
Mar 10 14:26:33 crc kubenswrapper[4911]: I0310 14:26:33.656945 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 10 14:26:36 crc kubenswrapper[4911]: I0310 14:26:36.589181 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 10 14:26:38 crc kubenswrapper[4911]: I0310 14:26:38.011260 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 10 14:26:38 crc kubenswrapper[4911]: I0310 14:26:38.011600 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 10 14:26:38 crc kubenswrapper[4911]: I0310 14:26:38.657645 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 10 14:26:38 crc kubenswrapper[4911]: I0310 14:26:38.697062 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 10 14:26:39 crc kubenswrapper[4911]: I0310 14:26:39.024971 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 10 14:26:39 crc kubenswrapper[4911]: I0310 14:26:39.024971 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 10 14:26:39 crc kubenswrapper[4911]: I0310 14:26:39.374147 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 10 14:26:40 crc kubenswrapper[4911]: I0310 14:26:40.755096 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 10 14:26:40 crc kubenswrapper[4911]: I0310 14:26:40.755170 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 10 14:26:41 crc kubenswrapper[4911]: I0310 14:26:41.843225 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.206:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 14:26:41 crc kubenswrapper[4911]: I0310 14:26:41.843193 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.206:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 14:26:43 crc kubenswrapper[4911]: I0310 14:26:43.075759 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 10 14:26:48 crc kubenswrapper[4911]: I0310 14:26:48.018098 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 10 14:26:48 crc kubenswrapper[4911]: I0310 14:26:48.018775 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 10 14:26:48 crc kubenswrapper[4911]: I0310 14:26:48.026900 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 10 14:26:48 crc kubenswrapper[4911]: I0310 14:26:48.027366 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.400777 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.473813 4911 generic.go:334] "Generic (PLEG): container finished" podID="ba983b1b-2e11-4bc2-bb85-0e10ec390df9" containerID="24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02" exitCode=137
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.473864 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ba983b1b-2e11-4bc2-bb85-0e10ec390df9","Type":"ContainerDied","Data":"24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02"}
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.473896 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ba983b1b-2e11-4bc2-bb85-0e10ec390df9","Type":"ContainerDied","Data":"ba912882ee742c372bc8f24288b0caf88010a944a794313bdb9c2ae67afe841a"}
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.473916 4911 scope.go:117] "RemoveContainer" containerID="24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.473870 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.500956 4911 scope.go:117] "RemoveContainer" containerID="24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02"
Mar 10 14:26:50 crc kubenswrapper[4911]: E0310 14:26:50.501717 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02\": container with ID starting with 24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02 not found: ID does not exist" containerID="24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.501873 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02"} err="failed to get container status \"24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02\": rpc error: code = NotFound desc = could not find container \"24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02\": container with ID starting with 24785a32f42e0ad3f4bef7111fd8a2c60e217a4729d4771b60c4526588d79b02 not found: ID does not exist"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.528154 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-config-data\") pod \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") "
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.528257 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc6xt\" (UniqueName: \"kubernetes.io/projected/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-kube-api-access-mc6xt\") pod \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") "
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.528303 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-combined-ca-bundle\") pod \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\" (UID: \"ba983b1b-2e11-4bc2-bb85-0e10ec390df9\") "
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.535747 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-kube-api-access-mc6xt" (OuterVolumeSpecName: "kube-api-access-mc6xt") pod "ba983b1b-2e11-4bc2-bb85-0e10ec390df9" (UID: "ba983b1b-2e11-4bc2-bb85-0e10ec390df9"). InnerVolumeSpecName "kube-api-access-mc6xt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.559672 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-config-data" (OuterVolumeSpecName: "config-data") pod "ba983b1b-2e11-4bc2-bb85-0e10ec390df9" (UID: "ba983b1b-2e11-4bc2-bb85-0e10ec390df9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.560766 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba983b1b-2e11-4bc2-bb85-0e10ec390df9" (UID: "ba983b1b-2e11-4bc2-bb85-0e10ec390df9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.630499 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.630545 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc6xt\" (UniqueName: \"kubernetes.io/projected/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-kube-api-access-mc6xt\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.630561 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba983b1b-2e11-4bc2-bb85-0e10ec390df9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.760115 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.760572 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.761392 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.762915 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.837238 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.857557 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.874437 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 14:26:50 crc kubenswrapper[4911]: E0310 14:26:50.875062 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba983b1b-2e11-4bc2-bb85-0e10ec390df9" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.875085 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba983b1b-2e11-4bc2-bb85-0e10ec390df9" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.875330 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba983b1b-2e11-4bc2-bb85-0e10ec390df9" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.876196 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.878850 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.879246 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.879485 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 10 14:26:50 crc kubenswrapper[4911]: I0310 14:26:50.890749 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.038644 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.038699 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.038781 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.038807 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bzc6\" (UniqueName: \"kubernetes.io/projected/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-kube-api-access-8bzc6\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.038889 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.140809 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.141127 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.141205 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.141242 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bzc6\" (UniqueName: \"kubernetes.io/projected/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-kube-api-access-8bzc6\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.141315 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.145668 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.146022 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.146622 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.158460 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.161188 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bzc6\" (UniqueName: \"kubernetes.io/projected/3b68c0d5-c7e3-4b1d-b9a0-337f56619c45-kube-api-access-8bzc6\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.204404 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.486930 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.491615 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.685234 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4rtt8"]
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.688376 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.705510 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4rtt8"]
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.729580 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.867317 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.867408 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8"
Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.867450 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.867520 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-config\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.868001 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6drz\" (UniqueName: \"kubernetes.io/projected/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-kube-api-access-k6drz\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.868445 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.971170 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6drz\" (UniqueName: \"kubernetes.io/projected/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-kube-api-access-k6drz\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.971621 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.971672 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.971718 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.971792 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.971831 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-config\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.973608 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.974229 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.975051 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.976194 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.978526 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-config\") pod \"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:51 crc kubenswrapper[4911]: I0310 14:26:51.990798 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6drz\" (UniqueName: \"kubernetes.io/projected/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-kube-api-access-k6drz\") pod 
\"dnsmasq-dns-89c5cd4d5-4rtt8\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:52 crc kubenswrapper[4911]: I0310 14:26:52.011530 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:52 crc kubenswrapper[4911]: I0310 14:26:52.084431 4911 scope.go:117] "RemoveContainer" containerID="79abc93edbe6f6458ffb43229595a67741637c5a8b2ad8aeeaf74ee842553c06" Mar 10 14:26:52 crc kubenswrapper[4911]: I0310 14:26:52.217833 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba983b1b-2e11-4bc2-bb85-0e10ec390df9" path="/var/lib/kubelet/pods/ba983b1b-2e11-4bc2-bb85-0e10ec390df9/volumes" Mar 10 14:26:52 crc kubenswrapper[4911]: I0310 14:26:52.505297 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45","Type":"ContainerStarted","Data":"5ba2edbb5fac77544d30d6d1e87027fe467a1b53c4066ea472b3ba5c0ef47c5d"} Mar 10 14:26:52 crc kubenswrapper[4911]: I0310 14:26:52.505756 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3b68c0d5-c7e3-4b1d-b9a0-337f56619c45","Type":"ContainerStarted","Data":"a733dcc44630aaaf182f67560b0032ae7358a8277e38f8f99573159371d60c1e"} Mar 10 14:26:52 crc kubenswrapper[4911]: I0310 14:26:52.509420 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4rtt8"] Mar 10 14:26:52 crc kubenswrapper[4911]: I0310 14:26:52.537657 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.537632399 podStartE2EDuration="2.537632399s" podCreationTimestamp="2026-03-10 14:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:52.522037056 +0000 UTC 
m=+1517.085556973" watchObservedRunningTime="2026-03-10 14:26:52.537632399 +0000 UTC m=+1517.101152316" Mar 10 14:26:53 crc kubenswrapper[4911]: I0310 14:26:53.516597 4911 generic.go:334] "Generic (PLEG): container finished" podID="bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" containerID="1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d" exitCode=0 Mar 10 14:26:53 crc kubenswrapper[4911]: I0310 14:26:53.516701 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" event={"ID":"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6","Type":"ContainerDied","Data":"1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d"} Mar 10 14:26:53 crc kubenswrapper[4911]: I0310 14:26:53.517166 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" event={"ID":"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6","Type":"ContainerStarted","Data":"af31946cff891d8791b97294696719dcaaf6c580c82d93d5ff2c56b61e40cc21"} Mar 10 14:26:53 crc kubenswrapper[4911]: I0310 14:26:53.972484 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:26:53 crc kubenswrapper[4911]: I0310 14:26:53.973297 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="sg-core" containerID="cri-o://0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad" gracePeriod=30 Mar 10 14:26:53 crc kubenswrapper[4911]: I0310 14:26:53.973373 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="ceilometer-notification-agent" containerID="cri-o://b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f" gracePeriod=30 Mar 10 14:26:53 crc kubenswrapper[4911]: I0310 14:26:53.973486 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="proxy-httpd" containerID="cri-o://6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4" gracePeriod=30 Mar 10 14:26:53 crc kubenswrapper[4911]: I0310 14:26:53.973832 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="ceilometer-central-agent" containerID="cri-o://5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83" gracePeriod=30 Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.356573 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.537140 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" event={"ID":"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6","Type":"ContainerStarted","Data":"d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75"} Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.537450 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.540264 4911 generic.go:334] "Generic (PLEG): container finished" podID="067898a3-878b-4cf6-8780-2e6692f31765" containerID="6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4" exitCode=0 Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.540286 4911 generic.go:334] "Generic (PLEG): container finished" podID="067898a3-878b-4cf6-8780-2e6692f31765" containerID="0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad" exitCode=2 Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.540298 4911 generic.go:334] "Generic (PLEG): container finished" podID="067898a3-878b-4cf6-8780-2e6692f31765" containerID="5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83" exitCode=0 Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.540497 
4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerName="nova-api-log" containerID="cri-o://31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04" gracePeriod=30 Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.540711 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067898a3-878b-4cf6-8780-2e6692f31765","Type":"ContainerDied","Data":"6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4"} Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.540772 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067898a3-878b-4cf6-8780-2e6692f31765","Type":"ContainerDied","Data":"0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad"} Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.540789 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067898a3-878b-4cf6-8780-2e6692f31765","Type":"ContainerDied","Data":"5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83"} Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.540877 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerName="nova-api-api" containerID="cri-o://7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110" gracePeriod=30 Mar 10 14:26:54 crc kubenswrapper[4911]: I0310 14:26:54.560054 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" podStartSLOduration=3.5600271489999997 podStartE2EDuration="3.560027149s" podCreationTimestamp="2026-03-10 14:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:26:54.556057254 +0000 UTC m=+1519.119577171" 
watchObservedRunningTime="2026-03-10 14:26:54.560027149 +0000 UTC m=+1519.123547066" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.124185 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.251432 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-scripts\") pod \"067898a3-878b-4cf6-8780-2e6692f31765\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.251621 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-config-data\") pod \"067898a3-878b-4cf6-8780-2e6692f31765\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.251663 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-ceilometer-tls-certs\") pod \"067898a3-878b-4cf6-8780-2e6692f31765\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.251692 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwpql\" (UniqueName: \"kubernetes.io/projected/067898a3-878b-4cf6-8780-2e6692f31765-kube-api-access-nwpql\") pod \"067898a3-878b-4cf6-8780-2e6692f31765\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.251713 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-run-httpd\") pod \"067898a3-878b-4cf6-8780-2e6692f31765\" (UID: 
\"067898a3-878b-4cf6-8780-2e6692f31765\") " Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.251912 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-sg-core-conf-yaml\") pod \"067898a3-878b-4cf6-8780-2e6692f31765\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.251993 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-log-httpd\") pod \"067898a3-878b-4cf6-8780-2e6692f31765\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.252033 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-combined-ca-bundle\") pod \"067898a3-878b-4cf6-8780-2e6692f31765\" (UID: \"067898a3-878b-4cf6-8780-2e6692f31765\") " Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.256439 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "067898a3-878b-4cf6-8780-2e6692f31765" (UID: "067898a3-878b-4cf6-8780-2e6692f31765"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.256748 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "067898a3-878b-4cf6-8780-2e6692f31765" (UID: "067898a3-878b-4cf6-8780-2e6692f31765"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.263187 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-scripts" (OuterVolumeSpecName: "scripts") pod "067898a3-878b-4cf6-8780-2e6692f31765" (UID: "067898a3-878b-4cf6-8780-2e6692f31765"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.263463 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067898a3-878b-4cf6-8780-2e6692f31765-kube-api-access-nwpql" (OuterVolumeSpecName: "kube-api-access-nwpql") pod "067898a3-878b-4cf6-8780-2e6692f31765" (UID: "067898a3-878b-4cf6-8780-2e6692f31765"). InnerVolumeSpecName "kube-api-access-nwpql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.301991 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "067898a3-878b-4cf6-8780-2e6692f31765" (UID: "067898a3-878b-4cf6-8780-2e6692f31765"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.330777 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "067898a3-878b-4cf6-8780-2e6692f31765" (UID: "067898a3-878b-4cf6-8780-2e6692f31765"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.355686 4911 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.355751 4911 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.355765 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.355776 4911 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.355791 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwpql\" (UniqueName: \"kubernetes.io/projected/067898a3-878b-4cf6-8780-2e6692f31765-kube-api-access-nwpql\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.355804 4911 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067898a3-878b-4cf6-8780-2e6692f31765-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.368087 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-config-data" (OuterVolumeSpecName: "config-data") pod "067898a3-878b-4cf6-8780-2e6692f31765" (UID: "067898a3-878b-4cf6-8780-2e6692f31765"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.378195 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "067898a3-878b-4cf6-8780-2e6692f31765" (UID: "067898a3-878b-4cf6-8780-2e6692f31765"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.457634 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.457927 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067898a3-878b-4cf6-8780-2e6692f31765-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.555304 4911 generic.go:334] "Generic (PLEG): container finished" podID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerID="31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04" exitCode=143 Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.555405 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d29545ce-9c60-44b8-8c4f-54b490100dea","Type":"ContainerDied","Data":"31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04"} Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.558837 4911 generic.go:334] "Generic (PLEG): container finished" podID="067898a3-878b-4cf6-8780-2e6692f31765" containerID="b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f" exitCode=0 Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.559434 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.559558 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067898a3-878b-4cf6-8780-2e6692f31765","Type":"ContainerDied","Data":"b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f"} Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.559596 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067898a3-878b-4cf6-8780-2e6692f31765","Type":"ContainerDied","Data":"e2ffa80d39f34f1f07d3be320c2aed47c8398d877094584ca12d3c8153459193"} Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.559616 4911 scope.go:117] "RemoveContainer" containerID="6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.629798 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.631853 4911 scope.go:117] "RemoveContainer" containerID="0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.643409 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.662090 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:26:55 crc kubenswrapper[4911]: E0310 14:26:55.662763 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="ceilometer-notification-agent" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.662794 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="ceilometer-notification-agent" Mar 10 14:26:55 crc kubenswrapper[4911]: E0310 14:26:55.662815 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="proxy-httpd" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.662831 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="proxy-httpd" Mar 10 14:26:55 crc kubenswrapper[4911]: E0310 14:26:55.662860 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="sg-core" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.662870 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="sg-core" Mar 10 14:26:55 crc kubenswrapper[4911]: E0310 14:26:55.662894 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="ceilometer-central-agent" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.662903 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="ceilometer-central-agent" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.663099 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="ceilometer-notification-agent" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.663117 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="sg-core" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.663128 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="ceilometer-central-agent" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.663141 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="067898a3-878b-4cf6-8780-2e6692f31765" containerName="proxy-httpd" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.665399 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.668498 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.668740 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.668853 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.668758 4911 scope.go:117] "RemoveContainer" containerID="b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.696782 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.726161 4911 scope.go:117] "RemoveContainer" containerID="5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.746060 4911 scope.go:117] "RemoveContainer" containerID="6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4" Mar 10 14:26:55 crc kubenswrapper[4911]: E0310 14:26:55.746519 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4\": container with ID starting with 6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4 not found: ID does not exist" containerID="6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.746576 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4"} err="failed to get container status 
\"6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4\": rpc error: code = NotFound desc = could not find container \"6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4\": container with ID starting with 6f0f9f8cc835cb457edec1112df043ab4e5957511db279ca100577550ebd48a4 not found: ID does not exist" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.746615 4911 scope.go:117] "RemoveContainer" containerID="0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad" Mar 10 14:26:55 crc kubenswrapper[4911]: E0310 14:26:55.746887 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad\": container with ID starting with 0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad not found: ID does not exist" containerID="0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.746913 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad"} err="failed to get container status \"0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad\": rpc error: code = NotFound desc = could not find container \"0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad\": container with ID starting with 0055ee5c8d88626eaef86f00d1c119e0d15b3ebc2a196da1c693a840846390ad not found: ID does not exist" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.746926 4911 scope.go:117] "RemoveContainer" containerID="b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f" Mar 10 14:26:55 crc kubenswrapper[4911]: E0310 14:26:55.750141 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f\": container with ID starting with b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f not found: ID does not exist" containerID="b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.750174 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f"} err="failed to get container status \"b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f\": rpc error: code = NotFound desc = could not find container \"b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f\": container with ID starting with b906ca4e132b6ad2b18a852fb3e3b15250abbb44ca351971caa79f0a8a03466f not found: ID does not exist" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.750193 4911 scope.go:117] "RemoveContainer" containerID="5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83" Mar 10 14:26:55 crc kubenswrapper[4911]: E0310 14:26:55.750512 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83\": container with ID starting with 5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83 not found: ID does not exist" containerID="5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.750558 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83"} err="failed to get container status \"5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83\": rpc error: code = NotFound desc = could not find container \"5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83\": container with ID 
starting with 5d6c3198025b9a9cdb182752ee884995a074a205127c331e5ce3afd48cd3ec83 not found: ID does not exist" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.765451 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.765620 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-run-httpd\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.765745 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-log-httpd\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.765812 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-scripts\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.765839 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-config-data\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 
14:26:55.765877 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.765908 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjpp7\" (UniqueName: \"kubernetes.io/projected/1a8ee675-335d-420a-824e-660b4e4f8b98-kube-api-access-xjpp7\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.765934 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.869741 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-log-httpd\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.870037 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-log-httpd\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.870088 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-scripts\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.870120 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-config-data\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.870815 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.870903 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjpp7\" (UniqueName: \"kubernetes.io/projected/1a8ee675-335d-420a-824e-660b4e4f8b98-kube-api-access-xjpp7\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.870941 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.871022 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc 
kubenswrapper[4911]: I0310 14:26:55.871116 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-run-httpd\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.871929 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-run-httpd\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.875838 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.876787 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-config-data\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.877939 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.878617 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-scripts\") pod \"ceilometer-0\" (UID: 
\"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.882590 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:55 crc kubenswrapper[4911]: I0310 14:26:55.891442 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjpp7\" (UniqueName: \"kubernetes.io/projected/1a8ee675-335d-420a-824e-660b4e4f8b98-kube-api-access-xjpp7\") pod \"ceilometer-0\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " pod="openstack/ceilometer-0" Mar 10 14:26:56 crc kubenswrapper[4911]: I0310 14:26:56.009141 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:26:56 crc kubenswrapper[4911]: I0310 14:26:56.226039 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="067898a3-878b-4cf6-8780-2e6692f31765" path="/var/lib/kubelet/pods/067898a3-878b-4cf6-8780-2e6692f31765/volumes" Mar 10 14:26:56 crc kubenswrapper[4911]: I0310 14:26:56.232221 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:26:56 crc kubenswrapper[4911]: I0310 14:26:56.401024 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:26:56 crc kubenswrapper[4911]: I0310 14:26:56.523163 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:26:56 crc kubenswrapper[4911]: I0310 14:26:56.579932 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a8ee675-335d-420a-824e-660b4e4f8b98","Type":"ContainerStarted","Data":"bf85dc55ae34d8e6079adae8f776690f378e2e4bf9555114df8078569023eaf3"} Mar 10 14:26:57 crc 
kubenswrapper[4911]: I0310 14:26:57.590803 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a8ee675-335d-420a-824e-660b4e4f8b98","Type":"ContainerStarted","Data":"12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f"} Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.220624 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.258457 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-combined-ca-bundle\") pod \"d29545ce-9c60-44b8-8c4f-54b490100dea\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.258759 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d29545ce-9c60-44b8-8c4f-54b490100dea-logs\") pod \"d29545ce-9c60-44b8-8c4f-54b490100dea\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.258802 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-config-data\") pod \"d29545ce-9c60-44b8-8c4f-54b490100dea\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.258900 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzkck\" (UniqueName: \"kubernetes.io/projected/d29545ce-9c60-44b8-8c4f-54b490100dea-kube-api-access-qzkck\") pod \"d29545ce-9c60-44b8-8c4f-54b490100dea\" (UID: \"d29545ce-9c60-44b8-8c4f-54b490100dea\") " Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.259181 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d29545ce-9c60-44b8-8c4f-54b490100dea-logs" (OuterVolumeSpecName: "logs") pod "d29545ce-9c60-44b8-8c4f-54b490100dea" (UID: "d29545ce-9c60-44b8-8c4f-54b490100dea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.261068 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d29545ce-9c60-44b8-8c4f-54b490100dea-logs\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.272430 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d29545ce-9c60-44b8-8c4f-54b490100dea-kube-api-access-qzkck" (OuterVolumeSpecName: "kube-api-access-qzkck") pod "d29545ce-9c60-44b8-8c4f-54b490100dea" (UID: "d29545ce-9c60-44b8-8c4f-54b490100dea"). InnerVolumeSpecName "kube-api-access-qzkck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.301291 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-config-data" (OuterVolumeSpecName: "config-data") pod "d29545ce-9c60-44b8-8c4f-54b490100dea" (UID: "d29545ce-9c60-44b8-8c4f-54b490100dea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.307655 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d29545ce-9c60-44b8-8c4f-54b490100dea" (UID: "d29545ce-9c60-44b8-8c4f-54b490100dea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.363861 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.363902 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzkck\" (UniqueName: \"kubernetes.io/projected/d29545ce-9c60-44b8-8c4f-54b490100dea-kube-api-access-qzkck\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.363915 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d29545ce-9c60-44b8-8c4f-54b490100dea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.612963 4911 generic.go:334] "Generic (PLEG): container finished" podID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerID="7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110" exitCode=0 Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.613559 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d29545ce-9c60-44b8-8c4f-54b490100dea","Type":"ContainerDied","Data":"7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110"} Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.613618 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d29545ce-9c60-44b8-8c4f-54b490100dea","Type":"ContainerDied","Data":"5e489bf303f817431fb829fe7cd7a61e7dd53335fc64c14c15bff628360a2a17"} Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.613644 4911 scope.go:117] "RemoveContainer" containerID="7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.613921 4911 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.666181 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.667011 4911 scope.go:117] "RemoveContainer" containerID="31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.686817 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.718694 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 14:26:58 crc kubenswrapper[4911]: E0310 14:26:58.719305 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerName="nova-api-api" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.719335 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerName="nova-api-api" Mar 10 14:26:58 crc kubenswrapper[4911]: E0310 14:26:58.719364 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerName="nova-api-log" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.719371 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerName="nova-api-log" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.719573 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerName="nova-api-log" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.719607 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" containerName="nova-api-api" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.720797 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.723188 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.724246 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.724544 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.741133 4911 scope.go:117] "RemoveContainer" containerID="7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.741684 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:26:58 crc kubenswrapper[4911]: E0310 14:26:58.741773 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110\": container with ID starting with 7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110 not found: ID does not exist" containerID="7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.741938 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110"} err="failed to get container status \"7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110\": rpc error: code = NotFound desc = could not find container \"7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110\": container with ID starting with 7bcfa516ea20de05e14e387a7733c89301645d67a8e2170b2e4d9f9f7968b110 not found: ID does not exist" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.742036 4911 
scope.go:117] "RemoveContainer" containerID="31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04" Mar 10 14:26:58 crc kubenswrapper[4911]: E0310 14:26:58.745434 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04\": container with ID starting with 31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04 not found: ID does not exist" containerID="31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.745480 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04"} err="failed to get container status \"31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04\": rpc error: code = NotFound desc = could not find container \"31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04\": container with ID starting with 31da837a8f36d7b7433b97de3b09aa6fe75e16aab396e29e3ed8d7c60220fb04 not found: ID does not exist" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.772343 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d58490f-85c6-4ef7-8f2a-38559ad9952c-logs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.772443 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.772497 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-597rs\" (UniqueName: \"kubernetes.io/projected/0d58490f-85c6-4ef7-8f2a-38559ad9952c-kube-api-access-597rs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.772532 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.772603 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-config-data\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.772640 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-public-tls-certs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.877041 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-597rs\" (UniqueName: \"kubernetes.io/projected/0d58490f-85c6-4ef7-8f2a-38559ad9952c-kube-api-access-597rs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.877129 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.877202 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-config-data\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.877226 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-public-tls-certs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.877270 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d58490f-85c6-4ef7-8f2a-38559ad9952c-logs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.877331 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.885019 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d58490f-85c6-4ef7-8f2a-38559ad9952c-logs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.885549 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.886284 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-config-data\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.886303 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-public-tls-certs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.886845 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:58 crc kubenswrapper[4911]: I0310 14:26:58.897330 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-597rs\" (UniqueName: \"kubernetes.io/projected/0d58490f-85c6-4ef7-8f2a-38559ad9952c-kube-api-access-597rs\") pod \"nova-api-0\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " pod="openstack/nova-api-0" Mar 10 14:26:59 crc kubenswrapper[4911]: I0310 14:26:59.049966 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 14:26:59 crc kubenswrapper[4911]: I0310 14:26:59.643017 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:26:59 crc kubenswrapper[4911]: I0310 14:26:59.740976 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a8ee675-335d-420a-824e-660b4e4f8b98","Type":"ContainerStarted","Data":"29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1"} Mar 10 14:27:00 crc kubenswrapper[4911]: I0310 14:27:00.208769 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d29545ce-9c60-44b8-8c4f-54b490100dea" path="/var/lib/kubelet/pods/d29545ce-9c60-44b8-8c4f-54b490100dea/volumes" Mar 10 14:27:00 crc kubenswrapper[4911]: I0310 14:27:00.816331 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d58490f-85c6-4ef7-8f2a-38559ad9952c","Type":"ContainerStarted","Data":"e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4"} Mar 10 14:27:00 crc kubenswrapper[4911]: I0310 14:27:00.816670 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d58490f-85c6-4ef7-8f2a-38559ad9952c","Type":"ContainerStarted","Data":"eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5"} Mar 10 14:27:00 crc kubenswrapper[4911]: I0310 14:27:00.816684 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d58490f-85c6-4ef7-8f2a-38559ad9952c","Type":"ContainerStarted","Data":"01e9e948edf89ac62cca605322fbc67932c7ec23f2826a9e1bf684d8eda19122"} Mar 10 14:27:00 crc kubenswrapper[4911]: I0310 14:27:00.833029 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a8ee675-335d-420a-824e-660b4e4f8b98","Type":"ContainerStarted","Data":"5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9"} Mar 10 14:27:00 crc kubenswrapper[4911]: I0310 14:27:00.851428 
4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.851405967 podStartE2EDuration="2.851405967s" podCreationTimestamp="2026-03-10 14:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:27:00.848586892 +0000 UTC m=+1525.412106809" watchObservedRunningTime="2026-03-10 14:27:00.851405967 +0000 UTC m=+1525.414925884" Mar 10 14:27:01 crc kubenswrapper[4911]: I0310 14:27:01.205408 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:27:01 crc kubenswrapper[4911]: I0310 14:27:01.230851 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:27:01 crc kubenswrapper[4911]: I0310 14:27:01.877416 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.012910 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.104925 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-cc8d6"] Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.105284 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" podUID="61b95124-a31f-486a-8350-592dd5661c01" containerName="dnsmasq-dns" containerID="cri-o://c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183" gracePeriod=10 Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.223319 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hgssg"] Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.224879 4911 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.232847 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.233150 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.257532 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hgssg"] Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.280953 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r6r2\" (UniqueName: \"kubernetes.io/projected/d485c296-ac7f-4d09-ad90-470f8b608207-kube-api-access-8r6r2\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.281010 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-config-data\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.281288 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.281637 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-scripts\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.383006 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.383108 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-scripts\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.383164 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r6r2\" (UniqueName: \"kubernetes.io/projected/d485c296-ac7f-4d09-ad90-470f8b608207-kube-api-access-8r6r2\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.383187 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-config-data\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.390696 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-config-data\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.391480 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-scripts\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.393937 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.401935 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r6r2\" (UniqueName: \"kubernetes.io/projected/d485c296-ac7f-4d09-ad90-470f8b608207-kube-api-access-8r6r2\") pod \"nova-cell1-cell-mapping-hgssg\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.590593 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.676495 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.797371 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-config\") pod \"61b95124-a31f-486a-8350-592dd5661c01\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.797894 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-nb\") pod \"61b95124-a31f-486a-8350-592dd5661c01\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.797928 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-svc\") pod \"61b95124-a31f-486a-8350-592dd5661c01\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.798032 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg6st\" (UniqueName: \"kubernetes.io/projected/61b95124-a31f-486a-8350-592dd5661c01-kube-api-access-cg6st\") pod \"61b95124-a31f-486a-8350-592dd5661c01\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.798098 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-swift-storage-0\") pod \"61b95124-a31f-486a-8350-592dd5661c01\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.798154 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-sb\") pod \"61b95124-a31f-486a-8350-592dd5661c01\" (UID: \"61b95124-a31f-486a-8350-592dd5661c01\") " Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.819091 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b95124-a31f-486a-8350-592dd5661c01-kube-api-access-cg6st" (OuterVolumeSpecName: "kube-api-access-cg6st") pod "61b95124-a31f-486a-8350-592dd5661c01" (UID: "61b95124-a31f-486a-8350-592dd5661c01"). InnerVolumeSpecName "kube-api-access-cg6st". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.869621 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a8ee675-335d-420a-824e-660b4e4f8b98","Type":"ContainerStarted","Data":"5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d"} Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.869928 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="ceilometer-central-agent" containerID="cri-o://12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f" gracePeriod=30 Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.870257 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.870597 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="proxy-httpd" containerID="cri-o://5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d" gracePeriod=30 Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.870651 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="sg-core" containerID="cri-o://5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9" gracePeriod=30 Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.870689 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="ceilometer-notification-agent" containerID="cri-o://29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1" gracePeriod=30 Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.877789 4911 generic.go:334] "Generic (PLEG): container finished" podID="61b95124-a31f-486a-8350-592dd5661c01" containerID="c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183" exitCode=0 Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.877827 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.877865 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" event={"ID":"61b95124-a31f-486a-8350-592dd5661c01","Type":"ContainerDied","Data":"c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183"} Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.877894 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-cc8d6" event={"ID":"61b95124-a31f-486a-8350-592dd5661c01","Type":"ContainerDied","Data":"89a9098b330d1d36a68405faac7d2c0e836afabc05e0e13acb6eb428f961a868"} Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.877912 4911 scope.go:117] "RemoveContainer" containerID="c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.888918 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-svc" 
(OuterVolumeSpecName: "dns-svc") pod "61b95124-a31f-486a-8350-592dd5661c01" (UID: "61b95124-a31f-486a-8350-592dd5661c01"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.893400 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-config" (OuterVolumeSpecName: "config") pod "61b95124-a31f-486a-8350-592dd5661c01" (UID: "61b95124-a31f-486a-8350-592dd5661c01"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.899943 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.07066729 podStartE2EDuration="7.899899491s" podCreationTimestamp="2026-03-10 14:26:55 +0000 UTC" firstStartedPulling="2026-03-10 14:26:56.554587131 +0000 UTC m=+1521.118107058" lastFinishedPulling="2026-03-10 14:27:02.383819342 +0000 UTC m=+1526.947339259" observedRunningTime="2026-03-10 14:27:02.898315449 +0000 UTC m=+1527.461835366" watchObservedRunningTime="2026-03-10 14:27:02.899899491 +0000 UTC m=+1527.463419408" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.901269 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-config\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.901302 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.901312 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg6st\" (UniqueName: \"kubernetes.io/projected/61b95124-a31f-486a-8350-592dd5661c01-kube-api-access-cg6st\") on node 
\"crc\" DevicePath \"\"" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.904612 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "61b95124-a31f-486a-8350-592dd5661c01" (UID: "61b95124-a31f-486a-8350-592dd5661c01"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.915941 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "61b95124-a31f-486a-8350-592dd5661c01" (UID: "61b95124-a31f-486a-8350-592dd5661c01"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.920303 4911 scope.go:117] "RemoveContainer" containerID="8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.921053 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "61b95124-a31f-486a-8350-592dd5661c01" (UID: "61b95124-a31f-486a-8350-592dd5661c01"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.967399 4911 scope.go:117] "RemoveContainer" containerID="c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183" Mar 10 14:27:02 crc kubenswrapper[4911]: E0310 14:27:02.969810 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183\": container with ID starting with c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183 not found: ID does not exist" containerID="c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.969894 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183"} err="failed to get container status \"c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183\": rpc error: code = NotFound desc = could not find container \"c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183\": container with ID starting with c6d725479f75c6c25fe9b22aa2e5068666a6b4551fe06d038713c4c9a2881183 not found: ID does not exist" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.969935 4911 scope.go:117] "RemoveContainer" containerID="8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601" Mar 10 14:27:02 crc kubenswrapper[4911]: E0310 14:27:02.970571 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601\": container with ID starting with 8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601 not found: ID does not exist" containerID="8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601" Mar 10 14:27:02 crc kubenswrapper[4911]: I0310 14:27:02.970621 
4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601"} err="failed to get container status \"8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601\": rpc error: code = NotFound desc = could not find container \"8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601\": container with ID starting with 8c9503db1c9c91e02cafc5d09faac275888a4066f9a9fef29632411ea1f0a601 not found: ID does not exist" Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.003508 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.004249 4911 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.004270 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61b95124-a31f-486a-8350-592dd5661c01-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.146593 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hgssg"] Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.225765 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-cc8d6"] Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.239613 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-cc8d6"] Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.916351 4911 generic.go:334] "Generic (PLEG): container finished" 
podID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerID="5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9" exitCode=2 Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.916385 4911 generic.go:334] "Generic (PLEG): container finished" podID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerID="29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1" exitCode=0 Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.916446 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a8ee675-335d-420a-824e-660b4e4f8b98","Type":"ContainerDied","Data":"5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9"} Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.916484 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a8ee675-335d-420a-824e-660b4e4f8b98","Type":"ContainerDied","Data":"29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1"} Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.918001 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hgssg" event={"ID":"d485c296-ac7f-4d09-ad90-470f8b608207","Type":"ContainerStarted","Data":"628cde62ea5a91c5b4ffbc915f3335d589fc2c767f9fb9a8faeaeb3dbd805596"} Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.918032 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hgssg" event={"ID":"d485c296-ac7f-4d09-ad90-470f8b608207","Type":"ContainerStarted","Data":"9bbd33ea6522e3270b6d6fe422d09ef97ddc69ec18cf79d29088f86608ffebcb"} Mar 10 14:27:03 crc kubenswrapper[4911]: I0310 14:27:03.976413 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hgssg" podStartSLOduration=1.976381712 podStartE2EDuration="1.976381712s" podCreationTimestamp="2026-03-10 14:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:27:03.955132418 +0000 UTC m=+1528.518652345" watchObservedRunningTime="2026-03-10 14:27:03.976381712 +0000 UTC m=+1528.539901629" Mar 10 14:27:04 crc kubenswrapper[4911]: I0310 14:27:04.208469 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61b95124-a31f-486a-8350-592dd5661c01" path="/var/lib/kubelet/pods/61b95124-a31f-486a-8350-592dd5661c01/volumes" Mar 10 14:27:04 crc kubenswrapper[4911]: I0310 14:27:04.938778 4911 generic.go:334] "Generic (PLEG): container finished" podID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerID="12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f" exitCode=0 Mar 10 14:27:04 crc kubenswrapper[4911]: I0310 14:27:04.940553 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a8ee675-335d-420a-824e-660b4e4f8b98","Type":"ContainerDied","Data":"12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f"} Mar 10 14:27:09 crc kubenswrapper[4911]: I0310 14:27:08.999775 4911 generic.go:334] "Generic (PLEG): container finished" podID="d485c296-ac7f-4d09-ad90-470f8b608207" containerID="628cde62ea5a91c5b4ffbc915f3335d589fc2c767f9fb9a8faeaeb3dbd805596" exitCode=0 Mar 10 14:27:09 crc kubenswrapper[4911]: I0310 14:27:08.999909 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hgssg" event={"ID":"d485c296-ac7f-4d09-ad90-470f8b608207","Type":"ContainerDied","Data":"628cde62ea5a91c5b4ffbc915f3335d589fc2c767f9fb9a8faeaeb3dbd805596"} Mar 10 14:27:09 crc kubenswrapper[4911]: I0310 14:27:09.051006 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 14:27:09 crc kubenswrapper[4911]: I0310 14:27:09.051086 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.065986 4911 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.065979 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.421015 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hgssg" Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.591478 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r6r2\" (UniqueName: \"kubernetes.io/projected/d485c296-ac7f-4d09-ad90-470f8b608207-kube-api-access-8r6r2\") pod \"d485c296-ac7f-4d09-ad90-470f8b608207\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.592120 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-config-data\") pod \"d485c296-ac7f-4d09-ad90-470f8b608207\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.592249 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-combined-ca-bundle\") pod \"d485c296-ac7f-4d09-ad90-470f8b608207\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.592292 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-scripts\") pod \"d485c296-ac7f-4d09-ad90-470f8b608207\" (UID: \"d485c296-ac7f-4d09-ad90-470f8b608207\") " Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.612855 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-scripts" (OuterVolumeSpecName: "scripts") pod "d485c296-ac7f-4d09-ad90-470f8b608207" (UID: "d485c296-ac7f-4d09-ad90-470f8b608207"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.613625 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d485c296-ac7f-4d09-ad90-470f8b608207-kube-api-access-8r6r2" (OuterVolumeSpecName: "kube-api-access-8r6r2") pod "d485c296-ac7f-4d09-ad90-470f8b608207" (UID: "d485c296-ac7f-4d09-ad90-470f8b608207"). InnerVolumeSpecName "kube-api-access-8r6r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.628625 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d485c296-ac7f-4d09-ad90-470f8b608207" (UID: "d485c296-ac7f-4d09-ad90-470f8b608207"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.640365 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-config-data" (OuterVolumeSpecName: "config-data") pod "d485c296-ac7f-4d09-ad90-470f8b608207" (UID: "d485c296-ac7f-4d09-ad90-470f8b608207"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.695153 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.695198 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.695213 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d485c296-ac7f-4d09-ad90-470f8b608207-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:10 crc kubenswrapper[4911]: I0310 14:27:10.695225 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r6r2\" (UniqueName: \"kubernetes.io/projected/d485c296-ac7f-4d09-ad90-470f8b608207-kube-api-access-8r6r2\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.025447 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hgssg" event={"ID":"d485c296-ac7f-4d09-ad90-470f8b608207","Type":"ContainerDied","Data":"9bbd33ea6522e3270b6d6fe422d09ef97ddc69ec18cf79d29088f86608ffebcb"}
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.025507 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bbd33ea6522e3270b6d6fe422d09ef97ddc69ec18cf79d29088f86608ffebcb"
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.025531 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hgssg"
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.215978 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.216509 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerName="nova-api-log" containerID="cri-o://eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5" gracePeriod=30
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.216610 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerName="nova-api-api" containerID="cri-o://e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4" gracePeriod=30
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.282250 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.282646 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3430ae8c-7569-4d15-88cf-84ff4d6cc01a" containerName="nova-scheduler-scheduler" containerID="cri-o://63f75d5fcd57d05aabd0fb36bdd533eb30af6466e4266f45b52280767613c897" gracePeriod=30
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.301539 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.301871 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-log" containerID="cri-o://1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55" gracePeriod=30
Mar 10 14:27:11 crc kubenswrapper[4911]: I0310 14:27:11.302394 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-metadata" containerID="cri-o://cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4" gracePeriod=30
Mar 10 14:27:12 crc kubenswrapper[4911]: I0310 14:27:12.036451 4911 generic.go:334] "Generic (PLEG): container finished" podID="c0507925-e9be-40d0-96ad-2a99166d3674" containerID="1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55" exitCode=143
Mar 10 14:27:12 crc kubenswrapper[4911]: I0310 14:27:12.036533 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0507925-e9be-40d0-96ad-2a99166d3674","Type":"ContainerDied","Data":"1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55"}
Mar 10 14:27:12 crc kubenswrapper[4911]: I0310 14:27:12.038378 4911 generic.go:334] "Generic (PLEG): container finished" podID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerID="eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5" exitCode=143
Mar 10 14:27:12 crc kubenswrapper[4911]: I0310 14:27:12.038416 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d58490f-85c6-4ef7-8f2a-38559ad9952c","Type":"ContainerDied","Data":"eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5"}
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.051059 4911 generic.go:334] "Generic (PLEG): container finished" podID="3430ae8c-7569-4d15-88cf-84ff4d6cc01a" containerID="63f75d5fcd57d05aabd0fb36bdd533eb30af6466e4266f45b52280767613c897" exitCode=0
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.051174 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3430ae8c-7569-4d15-88cf-84ff4d6cc01a","Type":"ContainerDied","Data":"63f75d5fcd57d05aabd0fb36bdd533eb30af6466e4266f45b52280767613c897"}
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.204899 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.352427 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9zl9\" (UniqueName: \"kubernetes.io/projected/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-kube-api-access-c9zl9\") pod \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") "
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.352496 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-combined-ca-bundle\") pod \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") "
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.352705 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-config-data\") pod \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\" (UID: \"3430ae8c-7569-4d15-88cf-84ff4d6cc01a\") "
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.362542 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-kube-api-access-c9zl9" (OuterVolumeSpecName: "kube-api-access-c9zl9") pod "3430ae8c-7569-4d15-88cf-84ff4d6cc01a" (UID: "3430ae8c-7569-4d15-88cf-84ff4d6cc01a"). InnerVolumeSpecName "kube-api-access-c9zl9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.391944 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-config-data" (OuterVolumeSpecName: "config-data") pod "3430ae8c-7569-4d15-88cf-84ff4d6cc01a" (UID: "3430ae8c-7569-4d15-88cf-84ff4d6cc01a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.394055 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3430ae8c-7569-4d15-88cf-84ff4d6cc01a" (UID: "3430ae8c-7569-4d15-88cf-84ff4d6cc01a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.455614 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.455657 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9zl9\" (UniqueName: \"kubernetes.io/projected/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-kube-api-access-c9zl9\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:13 crc kubenswrapper[4911]: I0310 14:27:13.455673 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3430ae8c-7569-4d15-88cf-84ff4d6cc01a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.061663 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3430ae8c-7569-4d15-88cf-84ff4d6cc01a","Type":"ContainerDied","Data":"791bb90ca049dd1dc502b37c7a5f1e0563d9e76c39788230acf4fd6b65b3bff5"}
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.061751 4911 scope.go:117] "RemoveContainer" containerID="63f75d5fcd57d05aabd0fb36bdd533eb30af6466e4266f45b52280767613c897"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.061879 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.100931 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.118320 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.137823 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 14:27:14 crc kubenswrapper[4911]: E0310 14:27:14.138522 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d485c296-ac7f-4d09-ad90-470f8b608207" containerName="nova-manage"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.138551 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d485c296-ac7f-4d09-ad90-470f8b608207" containerName="nova-manage"
Mar 10 14:27:14 crc kubenswrapper[4911]: E0310 14:27:14.138596 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3430ae8c-7569-4d15-88cf-84ff4d6cc01a" containerName="nova-scheduler-scheduler"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.138611 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3430ae8c-7569-4d15-88cf-84ff4d6cc01a" containerName="nova-scheduler-scheduler"
Mar 10 14:27:14 crc kubenswrapper[4911]: E0310 14:27:14.138642 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b95124-a31f-486a-8350-592dd5661c01" containerName="dnsmasq-dns"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.138651 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b95124-a31f-486a-8350-592dd5661c01" containerName="dnsmasq-dns"
Mar 10 14:27:14 crc kubenswrapper[4911]: E0310 14:27:14.138675 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b95124-a31f-486a-8350-592dd5661c01" containerName="init"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.138683 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b95124-a31f-486a-8350-592dd5661c01" containerName="init"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.138937 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d485c296-ac7f-4d09-ad90-470f8b608207" containerName="nova-manage"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.138967 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3430ae8c-7569-4d15-88cf-84ff4d6cc01a" containerName="nova-scheduler-scheduler"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.138985 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b95124-a31f-486a-8350-592dd5661c01" containerName="dnsmasq-dns"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.139843 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.148039 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.150089 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.208694 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3430ae8c-7569-4d15-88cf-84ff4d6cc01a" path="/var/lib/kubelet/pods/3430ae8c-7569-4d15-88cf-84ff4d6cc01a/volumes"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.273963 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e72e32-585d-4c71-9788-fd40c839f2ed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3e72e32-585d-4c71-9788-fd40c839f2ed\") " pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.274554 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e72e32-585d-4c71-9788-fd40c839f2ed-config-data\") pod \"nova-scheduler-0\" (UID: \"f3e72e32-585d-4c71-9788-fd40c839f2ed\") " pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.274751 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zl2s\" (UniqueName: \"kubernetes.io/projected/f3e72e32-585d-4c71-9788-fd40c839f2ed-kube-api-access-2zl2s\") pod \"nova-scheduler-0\" (UID: \"f3e72e32-585d-4c71-9788-fd40c839f2ed\") " pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.378162 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e72e32-585d-4c71-9788-fd40c839f2ed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3e72e32-585d-4c71-9788-fd40c839f2ed\") " pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.378537 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e72e32-585d-4c71-9788-fd40c839f2ed-config-data\") pod \"nova-scheduler-0\" (UID: \"f3e72e32-585d-4c71-9788-fd40c839f2ed\") " pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.378620 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zl2s\" (UniqueName: \"kubernetes.io/projected/f3e72e32-585d-4c71-9788-fd40c839f2ed-kube-api-access-2zl2s\") pod \"nova-scheduler-0\" (UID: \"f3e72e32-585d-4c71-9788-fd40c839f2ed\") " pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.385450 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e72e32-585d-4c71-9788-fd40c839f2ed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3e72e32-585d-4c71-9788-fd40c839f2ed\") " pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.385456 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e72e32-585d-4c71-9788-fd40c839f2ed-config-data\") pod \"nova-scheduler-0\" (UID: \"f3e72e32-585d-4c71-9788-fd40c839f2ed\") " pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.400664 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zl2s\" (UniqueName: \"kubernetes.io/projected/f3e72e32-585d-4c71-9788-fd40c839f2ed-kube-api-access-2zl2s\") pod \"nova-scheduler-0\" (UID: \"f3e72e32-585d-4c71-9788-fd40c839f2ed\") " pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.440998 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:33018->10.217.0.204:8775: read: connection reset by peer"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.441045 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:33020->10.217.0.204:8775: read: connection reset by peer"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.466889 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.934105 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.994460 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0507925-e9be-40d0-96ad-2a99166d3674-logs\") pod \"c0507925-e9be-40d0-96ad-2a99166d3674\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") "
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.994567 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-combined-ca-bundle\") pod \"c0507925-e9be-40d0-96ad-2a99166d3674\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") "
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.994696 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-config-data\") pod \"c0507925-e9be-40d0-96ad-2a99166d3674\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") "
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.994754 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-nova-metadata-tls-certs\") pod \"c0507925-e9be-40d0-96ad-2a99166d3674\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") "
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.994866 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qxk7\" (UniqueName: \"kubernetes.io/projected/c0507925-e9be-40d0-96ad-2a99166d3674-kube-api-access-8qxk7\") pod \"c0507925-e9be-40d0-96ad-2a99166d3674\" (UID: \"c0507925-e9be-40d0-96ad-2a99166d3674\") "
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.995269 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0507925-e9be-40d0-96ad-2a99166d3674-logs" (OuterVolumeSpecName: "logs") pod "c0507925-e9be-40d0-96ad-2a99166d3674" (UID: "c0507925-e9be-40d0-96ad-2a99166d3674"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:27:14 crc kubenswrapper[4911]: I0310 14:27:14.995476 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0507925-e9be-40d0-96ad-2a99166d3674-logs\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.004135 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0507925-e9be-40d0-96ad-2a99166d3674-kube-api-access-8qxk7" (OuterVolumeSpecName: "kube-api-access-8qxk7") pod "c0507925-e9be-40d0-96ad-2a99166d3674" (UID: "c0507925-e9be-40d0-96ad-2a99166d3674"). InnerVolumeSpecName "kube-api-access-8qxk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.027036 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0507925-e9be-40d0-96ad-2a99166d3674" (UID: "c0507925-e9be-40d0-96ad-2a99166d3674"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.036804 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-config-data" (OuterVolumeSpecName: "config-data") pod "c0507925-e9be-40d0-96ad-2a99166d3674" (UID: "c0507925-e9be-40d0-96ad-2a99166d3674"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.063014 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c0507925-e9be-40d0-96ad-2a99166d3674" (UID: "c0507925-e9be-40d0-96ad-2a99166d3674"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.075968 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.080164 4911 generic.go:334] "Generic (PLEG): container finished" podID="c0507925-e9be-40d0-96ad-2a99166d3674" containerID="cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4" exitCode=0
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.080218 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0507925-e9be-40d0-96ad-2a99166d3674","Type":"ContainerDied","Data":"cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4"}
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.080252 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0507925-e9be-40d0-96ad-2a99166d3674","Type":"ContainerDied","Data":"6977bf98c8350ee451e34cf0cdb224cfccb6f95280995302e69396f65f4addd8"}
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.080279 4911 scope.go:117] "RemoveContainer" containerID="cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.080432 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.097583 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.097616 4911 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.097629 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qxk7\" (UniqueName: \"kubernetes.io/projected/c0507925-e9be-40d0-96ad-2a99166d3674-kube-api-access-8qxk7\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.097643 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0507925-e9be-40d0-96ad-2a99166d3674-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.125251 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.127816 4911 scope.go:117] "RemoveContainer" containerID="1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.143084 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.157192 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 14:27:15 crc kubenswrapper[4911]: E0310 14:27:15.158303 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-metadata"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.159069 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-metadata"
Mar 10 14:27:15 crc kubenswrapper[4911]: E0310 14:27:15.159153 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-log"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.159640 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-log"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.160014 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-metadata"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.160035 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" containerName="nova-metadata-log"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.161339 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.176368 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.177002 4911 scope.go:117] "RemoveContainer" containerID="cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.178461 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 10 14:27:15 crc kubenswrapper[4911]: E0310 14:27:15.179274 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4\": container with ID starting with cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4 not found: ID does not exist" containerID="cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.179355 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4"} err="failed to get container status \"cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4\": rpc error: code = NotFound desc = could not find container \"cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4\": container with ID starting with cbbf43be534b590e7fc35ec4e062543b7d3caa4949d9ae0c789ce57c00922cb4 not found: ID does not exist"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.179384 4911 scope.go:117] "RemoveContainer" containerID="1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.185516 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 14:27:15 crc kubenswrapper[4911]: E0310 14:27:15.185661 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55\": container with ID starting with 1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55 not found: ID does not exist" containerID="1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.185784 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55"} err="failed to get container status \"1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55\": rpc error: code = NotFound desc = could not find container \"1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55\": container with ID starting with 1020cea10eeb164c46f03670af30c73b7c3d670b8ab3f39d41aada09e2cbdf55 not found: ID does not exist"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.309021 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc8zk\" (UniqueName: \"kubernetes.io/projected/607959a6-b845-45ea-b09a-966237b6dd1a-kube-api-access-mc8zk\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.309183 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/607959a6-b845-45ea-b09a-966237b6dd1a-logs\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.309228 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607959a6-b845-45ea-b09a-966237b6dd1a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.309320 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/607959a6-b845-45ea-b09a-966237b6dd1a-config-data\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.309405 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/607959a6-b845-45ea-b09a-966237b6dd1a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.411406 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc8zk\" (UniqueName: \"kubernetes.io/projected/607959a6-b845-45ea-b09a-966237b6dd1a-kube-api-access-mc8zk\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.411542 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/607959a6-b845-45ea-b09a-966237b6dd1a-logs\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.411573 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607959a6-b845-45ea-b09a-966237b6dd1a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.411644 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/607959a6-b845-45ea-b09a-966237b6dd1a-config-data\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.411992 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/607959a6-b845-45ea-b09a-966237b6dd1a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.412275 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/607959a6-b845-45ea-b09a-966237b6dd1a-logs\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.416298 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/607959a6-b845-45ea-b09a-966237b6dd1a-config-data\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.417086 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607959a6-b845-45ea-b09a-966237b6dd1a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.417121 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/607959a6-b845-45ea-b09a-966237b6dd1a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.430443 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc8zk\" (UniqueName: \"kubernetes.io/projected/607959a6-b845-45ea-b09a-966237b6dd1a-kube-api-access-mc8zk\") pod \"nova-metadata-0\" (UID: \"607959a6-b845-45ea-b09a-966237b6dd1a\") " pod="openstack/nova-metadata-0"
Mar 10 14:27:15 crc kubenswrapper[4911]: I0310 14:27:15.503186 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.008578 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.068262 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.108991 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3e72e32-585d-4c71-9788-fd40c839f2ed","Type":"ContainerStarted","Data":"2ba3e97c1287c2f42877c84ffafcd97d77de90f1e93006f15bd97d9ceac50595"}
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.109406 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3e72e32-585d-4c71-9788-fd40c839f2ed","Type":"ContainerStarted","Data":"a361b713ee10e2e2d9753e2a18300e9bdc2d4f820b02ca2b197903d4d9ceb21b"}
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.113975 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"607959a6-b845-45ea-b09a-966237b6dd1a","Type":"ContainerStarted","Data":"d96d435154dace230557003e0cad5acf3b6809d82b7e1c0631b3068f678dc047"}
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.125775 4911 generic.go:334] "Generic (PLEG): container finished" podID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerID="e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4" exitCode=0
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.125851 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d58490f-85c6-4ef7-8f2a-38559ad9952c","Type":"ContainerDied","Data":"e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4"}
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.125914 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d58490f-85c6-4ef7-8f2a-38559ad9952c","Type":"ContainerDied","Data":"01e9e948edf89ac62cca605322fbc67932c7ec23f2826a9e1bf684d8eda19122"}
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.125931 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.125936 4911 scope.go:117] "RemoveContainer" containerID="e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4"
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.134400 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-597rs\" (UniqueName: \"kubernetes.io/projected/0d58490f-85c6-4ef7-8f2a-38559ad9952c-kube-api-access-597rs\") pod \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") "
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.134525 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-internal-tls-certs\") pod \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") "
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.136665 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-config-data\") pod \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") "
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.137987 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d58490f-85c6-4ef7-8f2a-38559ad9952c-logs\") pod \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") "
Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.138021 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-combined-ca-bundle\") pod \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " Mar
10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.138176 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-public-tls-certs\") pod \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\" (UID: \"0d58490f-85c6-4ef7-8f2a-38559ad9952c\") " Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.139972 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d58490f-85c6-4ef7-8f2a-38559ad9952c-logs" (OuterVolumeSpecName: "logs") pod "0d58490f-85c6-4ef7-8f2a-38559ad9952c" (UID: "0d58490f-85c6-4ef7-8f2a-38559ad9952c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.140277 4911 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d58490f-85c6-4ef7-8f2a-38559ad9952c-logs\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.141835 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d58490f-85c6-4ef7-8f2a-38559ad9952c-kube-api-access-597rs" (OuterVolumeSpecName: "kube-api-access-597rs") pod "0d58490f-85c6-4ef7-8f2a-38559ad9952c" (UID: "0d58490f-85c6-4ef7-8f2a-38559ad9952c"). InnerVolumeSpecName "kube-api-access-597rs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.152933 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.152898043 podStartE2EDuration="2.152898043s" podCreationTimestamp="2026-03-10 14:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:27:16.129826481 +0000 UTC m=+1540.693346418" watchObservedRunningTime="2026-03-10 14:27:16.152898043 +0000 UTC m=+1540.716417960" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.192162 4911 scope.go:117] "RemoveContainer" containerID="eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.207067 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-config-data" (OuterVolumeSpecName: "config-data") pod "0d58490f-85c6-4ef7-8f2a-38559ad9952c" (UID: "0d58490f-85c6-4ef7-8f2a-38559ad9952c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.209158 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0507925-e9be-40d0-96ad-2a99166d3674" path="/var/lib/kubelet/pods/c0507925-e9be-40d0-96ad-2a99166d3674/volumes" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.218528 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d58490f-85c6-4ef7-8f2a-38559ad9952c" (UID: "0d58490f-85c6-4ef7-8f2a-38559ad9952c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.238794 4911 scope.go:117] "RemoveContainer" containerID="e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4" Mar 10 14:27:16 crc kubenswrapper[4911]: E0310 14:27:16.239547 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4\": container with ID starting with e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4 not found: ID does not exist" containerID="e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.239666 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4"} err="failed to get container status \"e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4\": rpc error: code = NotFound desc = could not find container \"e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4\": container with ID starting with e36498b3ee35d370b7c7d85e3d8d021b2e78eeb7a06cd00ebc14fc28cb0b48d4 not found: ID does not exist" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.239797 4911 scope.go:117] "RemoveContainer" containerID="eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5" Mar 10 14:27:16 crc kubenswrapper[4911]: E0310 14:27:16.240273 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5\": container with ID starting with eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5 not found: ID does not exist" containerID="eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.240319 
4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5"} err="failed to get container status \"eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5\": rpc error: code = NotFound desc = could not find container \"eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5\": container with ID starting with eb9a381b9486b97961e754ae5ab20e6e175491667cc5622306c82b71167fd7f5 not found: ID does not exist" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.242168 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.242881 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.242899 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-597rs\" (UniqueName: \"kubernetes.io/projected/0d58490f-85c6-4ef7-8f2a-38559ad9952c-kube-api-access-597rs\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.252970 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0d58490f-85c6-4ef7-8f2a-38559ad9952c" (UID: "0d58490f-85c6-4ef7-8f2a-38559ad9952c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.258992 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0d58490f-85c6-4ef7-8f2a-38559ad9952c" (UID: "0d58490f-85c6-4ef7-8f2a-38559ad9952c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.345317 4911 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.345499 4911 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d58490f-85c6-4ef7-8f2a-38559ad9952c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.516170 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.541601 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.552203 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 14:27:16 crc kubenswrapper[4911]: E0310 14:27:16.552830 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerName="nova-api-log" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.552855 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerName="nova-api-log" Mar 10 14:27:16 crc kubenswrapper[4911]: E0310 14:27:16.552890 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerName="nova-api-api" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.552899 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerName="nova-api-api" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.553159 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerName="nova-api-log" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.553192 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" containerName="nova-api-api" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.555508 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.570885 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.571125 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.571274 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.593905 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.651521 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-public-tls-certs\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.651678 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.651709 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17b26bd6-b922-485e-9655-001a52e6731c-logs\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.651814 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-config-data\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.651857 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.651900 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hph2\" (UniqueName: \"kubernetes.io/projected/17b26bd6-b922-485e-9655-001a52e6731c-kube-api-access-6hph2\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.753619 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hph2\" (UniqueName: \"kubernetes.io/projected/17b26bd6-b922-485e-9655-001a52e6731c-kube-api-access-6hph2\") pod \"nova-api-0\" (UID: 
\"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.754027 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-public-tls-certs\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.754231 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.754306 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17b26bd6-b922-485e-9655-001a52e6731c-logs\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.754395 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-config-data\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.754519 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.754973 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/17b26bd6-b922-485e-9655-001a52e6731c-logs\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.762647 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-public-tls-certs\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.762839 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.764156 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-config-data\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.770882 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17b26bd6-b922-485e-9655-001a52e6731c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.773024 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hph2\" (UniqueName: \"kubernetes.io/projected/17b26bd6-b922-485e-9655-001a52e6731c-kube-api-access-6hph2\") pod \"nova-api-0\" (UID: \"17b26bd6-b922-485e-9655-001a52e6731c\") " pod="openstack/nova-api-0" Mar 10 14:27:16 crc kubenswrapper[4911]: I0310 14:27:16.890874 4911 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 14:27:17 crc kubenswrapper[4911]: I0310 14:27:17.163140 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"607959a6-b845-45ea-b09a-966237b6dd1a","Type":"ContainerStarted","Data":"66e709bb696962b3384562d4b8fe88d63193e9794ff8f68afc318d8ac0eb1ee4"} Mar 10 14:27:17 crc kubenswrapper[4911]: I0310 14:27:17.166004 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"607959a6-b845-45ea-b09a-966237b6dd1a","Type":"ContainerStarted","Data":"c3e25d617e2751e742a5c9f67471f2dd093a792c76d10e99308f5caa8e236738"} Mar 10 14:27:17 crc kubenswrapper[4911]: I0310 14:27:17.203157 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.203128219 podStartE2EDuration="2.203128219s" podCreationTimestamp="2026-03-10 14:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:27:17.184792852 +0000 UTC m=+1541.748312799" watchObservedRunningTime="2026-03-10 14:27:17.203128219 +0000 UTC m=+1541.766648156" Mar 10 14:27:17 crc kubenswrapper[4911]: I0310 14:27:17.481033 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 14:27:18 crc kubenswrapper[4911]: I0310 14:27:18.180096 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17b26bd6-b922-485e-9655-001a52e6731c","Type":"ContainerStarted","Data":"92beca2162d169153308a2dfd43212dda910c4a210f8d668029b8c3c33a113ab"} Mar 10 14:27:18 crc kubenswrapper[4911]: I0310 14:27:18.180602 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17b26bd6-b922-485e-9655-001a52e6731c","Type":"ContainerStarted","Data":"89f02d1411f9a9a0e89acaff6db48c60c363911466d60108a531240bdc0dd687"} Mar 10 14:27:18 crc 
kubenswrapper[4911]: I0310 14:27:18.180615 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17b26bd6-b922-485e-9655-001a52e6731c","Type":"ContainerStarted","Data":"cf754e12a23a3a41336a47d1687f4b6ce6621e32fea7f17962030c261372fdec"} Mar 10 14:27:18 crc kubenswrapper[4911]: I0310 14:27:18.217185 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.217155054 podStartE2EDuration="2.217155054s" podCreationTimestamp="2026-03-10 14:27:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:27:18.199995529 +0000 UTC m=+1542.763515456" watchObservedRunningTime="2026-03-10 14:27:18.217155054 +0000 UTC m=+1542.780674971" Mar 10 14:27:18 crc kubenswrapper[4911]: I0310 14:27:18.233087 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d58490f-85c6-4ef7-8f2a-38559ad9952c" path="/var/lib/kubelet/pods/0d58490f-85c6-4ef7-8f2a-38559ad9952c/volumes" Mar 10 14:27:18 crc kubenswrapper[4911]: I0310 14:27:18.521902 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:27:18 crc kubenswrapper[4911]: I0310 14:27:18.522004 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:27:19 crc kubenswrapper[4911]: I0310 14:27:19.467860 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 
14:27:20 crc kubenswrapper[4911]: I0310 14:27:20.504876 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 14:27:20 crc kubenswrapper[4911]: I0310 14:27:20.504966 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 14:27:24 crc kubenswrapper[4911]: I0310 14:27:24.467845 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 14:27:24 crc kubenswrapper[4911]: I0310 14:27:24.497099 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 14:27:25 crc kubenswrapper[4911]: I0310 14:27:25.305841 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 14:27:25 crc kubenswrapper[4911]: I0310 14:27:25.507067 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 14:27:25 crc kubenswrapper[4911]: I0310 14:27:25.507185 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 14:27:26 crc kubenswrapper[4911]: I0310 14:27:26.021258 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 10 14:27:26 crc kubenswrapper[4911]: I0310 14:27:26.514928 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="607959a6-b845-45ea-b09a-966237b6dd1a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 14:27:26 crc kubenswrapper[4911]: I0310 14:27:26.514979 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="607959a6-b845-45ea-b09a-966237b6dd1a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 14:27:26 crc kubenswrapper[4911]: I0310 14:27:26.894462 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 14:27:26 crc kubenswrapper[4911]: I0310 14:27:26.894560 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 14:27:27 crc kubenswrapper[4911]: I0310 14:27:27.906914 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="17b26bd6-b922-485e-9655-001a52e6731c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 14:27:27 crc kubenswrapper[4911]: I0310 14:27:27.906980 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="17b26bd6-b922-485e-9655-001a52e6731c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.301842 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.352695 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-log-httpd\") pod \"1a8ee675-335d-420a-824e-660b4e4f8b98\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.352807 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjpp7\" (UniqueName: \"kubernetes.io/projected/1a8ee675-335d-420a-824e-660b4e4f8b98-kube-api-access-xjpp7\") pod \"1a8ee675-335d-420a-824e-660b4e4f8b98\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.352871 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-config-data\") pod \"1a8ee675-335d-420a-824e-660b4e4f8b98\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.352899 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-combined-ca-bundle\") pod \"1a8ee675-335d-420a-824e-660b4e4f8b98\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.352926 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-ceilometer-tls-certs\") pod \"1a8ee675-335d-420a-824e-660b4e4f8b98\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.353171 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-sg-core-conf-yaml\") pod \"1a8ee675-335d-420a-824e-660b4e4f8b98\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.353195 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-run-httpd\") pod \"1a8ee675-335d-420a-824e-660b4e4f8b98\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.353214 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-scripts\") pod \"1a8ee675-335d-420a-824e-660b4e4f8b98\" (UID: \"1a8ee675-335d-420a-824e-660b4e4f8b98\") " Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.356689 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a8ee675-335d-420a-824e-660b4e4f8b98" (UID: "1a8ee675-335d-420a-824e-660b4e4f8b98"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.357298 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a8ee675-335d-420a-824e-660b4e4f8b98" (UID: "1a8ee675-335d-420a-824e-660b4e4f8b98"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.360400 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8ee675-335d-420a-824e-660b4e4f8b98-kube-api-access-xjpp7" (OuterVolumeSpecName: "kube-api-access-xjpp7") pod "1a8ee675-335d-420a-824e-660b4e4f8b98" (UID: "1a8ee675-335d-420a-824e-660b4e4f8b98"). InnerVolumeSpecName "kube-api-access-xjpp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.360919 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-scripts" (OuterVolumeSpecName: "scripts") pod "1a8ee675-335d-420a-824e-660b4e4f8b98" (UID: "1a8ee675-335d-420a-824e-660b4e4f8b98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.374561 4911 generic.go:334] "Generic (PLEG): container finished" podID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerID="5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d" exitCode=137 Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.374622 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a8ee675-335d-420a-824e-660b4e4f8b98","Type":"ContainerDied","Data":"5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d"} Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.374635 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.374662 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a8ee675-335d-420a-824e-660b4e4f8b98","Type":"ContainerDied","Data":"bf85dc55ae34d8e6079adae8f776690f378e2e4bf9555114df8078569023eaf3"} Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.374684 4911 scope.go:117] "RemoveContainer" containerID="5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.397700 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a8ee675-335d-420a-824e-660b4e4f8b98" (UID: "1a8ee675-335d-420a-824e-660b4e4f8b98"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.425774 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1a8ee675-335d-420a-824e-660b4e4f8b98" (UID: "1a8ee675-335d-420a-824e-660b4e4f8b98"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.438342 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a8ee675-335d-420a-824e-660b4e4f8b98" (UID: "1a8ee675-335d-420a-824e-660b4e4f8b98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.455583 4911 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.455627 4911 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.455643 4911 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.455655 4911 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a8ee675-335d-420a-824e-660b4e4f8b98-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.455669 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjpp7\" (UniqueName: \"kubernetes.io/projected/1a8ee675-335d-420a-824e-660b4e4f8b98-kube-api-access-xjpp7\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.455687 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.455698 4911 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.470217 4911 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-config-data" (OuterVolumeSpecName: "config-data") pod "1a8ee675-335d-420a-824e-660b4e4f8b98" (UID: "1a8ee675-335d-420a-824e-660b4e4f8b98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.507961 4911 scope.go:117] "RemoveContainer" containerID="5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.531775 4911 scope.go:117] "RemoveContainer" containerID="29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.551836 4911 scope.go:117] "RemoveContainer" containerID="12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.557749 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a8ee675-335d-420a-824e-660b4e4f8b98-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.572455 4911 scope.go:117] "RemoveContainer" containerID="5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d" Mar 10 14:27:33 crc kubenswrapper[4911]: E0310 14:27:33.572927 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d\": container with ID starting with 5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d not found: ID does not exist" containerID="5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.572963 4911 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d"} err="failed to get container status \"5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d\": rpc error: code = NotFound desc = could not find container \"5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d\": container with ID starting with 5f0c60980a16f4760447eee47d79622f1f696b94364c8cfbba532bc18318c57d not found: ID does not exist" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.572988 4911 scope.go:117] "RemoveContainer" containerID="5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9" Mar 10 14:27:33 crc kubenswrapper[4911]: E0310 14:27:33.573334 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9\": container with ID starting with 5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9 not found: ID does not exist" containerID="5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.573356 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9"} err="failed to get container status \"5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9\": rpc error: code = NotFound desc = could not find container \"5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9\": container with ID starting with 5cf704bf103564eb3afff123055fa1602c7f691fd42dbd4f450cb6f95200e6c9 not found: ID does not exist" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.573371 4911 scope.go:117] "RemoveContainer" containerID="29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1" Mar 10 14:27:33 crc kubenswrapper[4911]: E0310 14:27:33.573594 4911 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1\": container with ID starting with 29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1 not found: ID does not exist" containerID="29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.573613 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1"} err="failed to get container status \"29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1\": rpc error: code = NotFound desc = could not find container \"29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1\": container with ID starting with 29d1aac6b9cbfb1d384788e6fe04e352876600c6e46099c56f4189fb3ac58fa1 not found: ID does not exist" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.573624 4911 scope.go:117] "RemoveContainer" containerID="12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f" Mar 10 14:27:33 crc kubenswrapper[4911]: E0310 14:27:33.573849 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f\": container with ID starting with 12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f not found: ID does not exist" containerID="12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.573871 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f"} err="failed to get container status \"12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f\": rpc error: code = NotFound desc = could not find container 
\"12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f\": container with ID starting with 12412bb00dfb455b6b58391d3cfeca3aee6fbeef0fd107bd7053245f9538f09f not found: ID does not exist" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.717739 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.747119 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.756032 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:27:33 crc kubenswrapper[4911]: E0310 14:27:33.756716 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="ceilometer-notification-agent" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.756879 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="ceilometer-notification-agent" Mar 10 14:27:33 crc kubenswrapper[4911]: E0310 14:27:33.756974 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="ceilometer-central-agent" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.757057 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="ceilometer-central-agent" Mar 10 14:27:33 crc kubenswrapper[4911]: E0310 14:27:33.757137 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="proxy-httpd" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.757197 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="proxy-httpd" Mar 10 14:27:33 crc kubenswrapper[4911]: E0310 14:27:33.757305 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="sg-core" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.757379 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="sg-core" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.757679 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="sg-core" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.758020 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="proxy-httpd" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.758103 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="ceilometer-central-agent" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.758178 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" containerName="ceilometer-notification-agent" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.761613 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.768313 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.768594 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.769661 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.791062 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.863853 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92bb8486-3729-4b5d-8f09-b99baf382c52-run-httpd\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.863912 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.863930 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92bb8486-3729-4b5d-8f09-b99baf382c52-log-httpd\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.863956 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k4b9m\" (UniqueName: \"kubernetes.io/projected/92bb8486-3729-4b5d-8f09-b99baf382c52-kube-api-access-k4b9m\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.864027 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.864094 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-scripts\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.864119 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-config-data\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.864149 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.965549 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-scripts\") pod \"ceilometer-0\" (UID: 
\"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.965614 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-config-data\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.965650 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.965680 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92bb8486-3729-4b5d-8f09-b99baf382c52-run-httpd\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.965704 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.965738 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92bb8486-3729-4b5d-8f09-b99baf382c52-log-httpd\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.965765 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k4b9m\" (UniqueName: \"kubernetes.io/projected/92bb8486-3729-4b5d-8f09-b99baf382c52-kube-api-access-k4b9m\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.965831 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.966938 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92bb8486-3729-4b5d-8f09-b99baf382c52-run-httpd\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.966953 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92bb8486-3729-4b5d-8f09-b99baf382c52-log-httpd\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.972940 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.972970 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-scripts\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 
14:27:33.973395 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.976911 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-config-data\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.977194 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/92bb8486-3729-4b5d-8f09-b99baf382c52-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:33 crc kubenswrapper[4911]: I0310 14:27:33.986918 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4b9m\" (UniqueName: \"kubernetes.io/projected/92bb8486-3729-4b5d-8f09-b99baf382c52-kube-api-access-k4b9m\") pod \"ceilometer-0\" (UID: \"92bb8486-3729-4b5d-8f09-b99baf382c52\") " pod="openstack/ceilometer-0" Mar 10 14:27:34 crc kubenswrapper[4911]: I0310 14:27:34.094660 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 14:27:34 crc kubenswrapper[4911]: I0310 14:27:34.207596 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8ee675-335d-420a-824e-660b4e4f8b98" path="/var/lib/kubelet/pods/1a8ee675-335d-420a-824e-660b4e4f8b98/volumes" Mar 10 14:27:34 crc kubenswrapper[4911]: I0310 14:27:34.542279 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 14:27:34 crc kubenswrapper[4911]: I0310 14:27:34.548059 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 14:27:35 crc kubenswrapper[4911]: I0310 14:27:35.400283 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92bb8486-3729-4b5d-8f09-b99baf382c52","Type":"ContainerStarted","Data":"159f658dfb5060e7532ab2681dce86711b365e82e02449cd0606e0afa11a1e64"} Mar 10 14:27:35 crc kubenswrapper[4911]: I0310 14:27:35.511224 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 14:27:35 crc kubenswrapper[4911]: I0310 14:27:35.512297 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 14:27:35 crc kubenswrapper[4911]: I0310 14:27:35.522760 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 14:27:36 crc kubenswrapper[4911]: I0310 14:27:36.418367 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92bb8486-3729-4b5d-8f09-b99baf382c52","Type":"ContainerStarted","Data":"078bca83dcf1f4e1d96fa63915aaf5acf8ff40480b7d7667168531ff843d7e6e"} Mar 10 14:27:36 crc kubenswrapper[4911]: I0310 14:27:36.419048 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"92bb8486-3729-4b5d-8f09-b99baf382c52","Type":"ContainerStarted","Data":"85428113b6aa2ec0233967c4cc02b2c898d4c42d1f683912d0d9120416653aa1"} Mar 10 14:27:36 crc kubenswrapper[4911]: I0310 14:27:36.424030 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 14:27:36 crc kubenswrapper[4911]: I0310 14:27:36.899781 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 14:27:36 crc kubenswrapper[4911]: I0310 14:27:36.900681 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 14:27:36 crc kubenswrapper[4911]: I0310 14:27:36.900789 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 14:27:36 crc kubenswrapper[4911]: I0310 14:27:36.906260 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 14:27:37 crc kubenswrapper[4911]: I0310 14:27:37.441031 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92bb8486-3729-4b5d-8f09-b99baf382c52","Type":"ContainerStarted","Data":"da25028432b351c3a89fdbfb83b4e2fdc20561b28a195159ff3bae3f3803450d"} Mar 10 14:27:37 crc kubenswrapper[4911]: I0310 14:27:37.442231 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 14:27:37 crc kubenswrapper[4911]: I0310 14:27:37.465966 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 14:27:39 crc kubenswrapper[4911]: I0310 14:27:39.467497 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92bb8486-3729-4b5d-8f09-b99baf382c52","Type":"ContainerStarted","Data":"ef19549ebcc03f3edaf5b05293652aefd1c4a23eeddfc3835ee8169ed0f338c9"} Mar 10 14:27:39 crc kubenswrapper[4911]: I0310 14:27:39.509041 4911 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9524745060000002 podStartE2EDuration="6.509013013s" podCreationTimestamp="2026-03-10 14:27:33 +0000 UTC" firstStartedPulling="2026-03-10 14:27:34.547796359 +0000 UTC m=+1559.111316266" lastFinishedPulling="2026-03-10 14:27:39.104334856 +0000 UTC m=+1563.667854773" observedRunningTime="2026-03-10 14:27:39.498242268 +0000 UTC m=+1564.061762185" watchObservedRunningTime="2026-03-10 14:27:39.509013013 +0000 UTC m=+1564.072532930" Mar 10 14:27:40 crc kubenswrapper[4911]: I0310 14:27:40.479166 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 14:27:48 crc kubenswrapper[4911]: I0310 14:27:48.521108 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:27:48 crc kubenswrapper[4911]: I0310 14:27:48.521926 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:28:00 crc kubenswrapper[4911]: I0310 14:28:00.165458 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552548-4fdgg"] Mar 10 14:28:00 crc kubenswrapper[4911]: I0310 14:28:00.169348 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552548-4fdgg" Mar 10 14:28:00 crc kubenswrapper[4911]: I0310 14:28:00.172483 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:28:00 crc kubenswrapper[4911]: I0310 14:28:00.172978 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:28:00 crc kubenswrapper[4911]: I0310 14:28:00.175337 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:28:00 crc kubenswrapper[4911]: I0310 14:28:00.176394 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552548-4fdgg"] Mar 10 14:28:00 crc kubenswrapper[4911]: I0310 14:28:00.225247 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqr5n\" (UniqueName: \"kubernetes.io/projected/0d8f8654-0a72-4998-822d-7fd5d24a487e-kube-api-access-lqr5n\") pod \"auto-csr-approver-29552548-4fdgg\" (UID: \"0d8f8654-0a72-4998-822d-7fd5d24a487e\") " pod="openshift-infra/auto-csr-approver-29552548-4fdgg" Mar 10 14:28:00 crc kubenswrapper[4911]: I0310 14:28:00.328335 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqr5n\" (UniqueName: \"kubernetes.io/projected/0d8f8654-0a72-4998-822d-7fd5d24a487e-kube-api-access-lqr5n\") pod \"auto-csr-approver-29552548-4fdgg\" (UID: \"0d8f8654-0a72-4998-822d-7fd5d24a487e\") " pod="openshift-infra/auto-csr-approver-29552548-4fdgg" Mar 10 14:28:00 crc kubenswrapper[4911]: I0310 14:28:00.349537 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqr5n\" (UniqueName: \"kubernetes.io/projected/0d8f8654-0a72-4998-822d-7fd5d24a487e-kube-api-access-lqr5n\") pod \"auto-csr-approver-29552548-4fdgg\" (UID: \"0d8f8654-0a72-4998-822d-7fd5d24a487e\") " 
pod="openshift-infra/auto-csr-approver-29552548-4fdgg" Mar 10 14:28:00 crc kubenswrapper[4911]: I0310 14:28:00.504091 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552548-4fdgg" Mar 10 14:28:01 crc kubenswrapper[4911]: I0310 14:28:01.033683 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552548-4fdgg"] Mar 10 14:28:01 crc kubenswrapper[4911]: I0310 14:28:01.733815 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552548-4fdgg" event={"ID":"0d8f8654-0a72-4998-822d-7fd5d24a487e","Type":"ContainerStarted","Data":"c8a62e8945ae8612cf425028f268a417244104c8199a2bab40cce1eac8c613f4"} Mar 10 14:28:02 crc kubenswrapper[4911]: I0310 14:28:02.746671 4911 generic.go:334] "Generic (PLEG): container finished" podID="0d8f8654-0a72-4998-822d-7fd5d24a487e" containerID="6ac24e83b1bcafb3912912c9a8d36f42c0cd46cc3cea4ee86def16e037d476d7" exitCode=0 Mar 10 14:28:02 crc kubenswrapper[4911]: I0310 14:28:02.746779 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552548-4fdgg" event={"ID":"0d8f8654-0a72-4998-822d-7fd5d24a487e","Type":"ContainerDied","Data":"6ac24e83b1bcafb3912912c9a8d36f42c0cd46cc3cea4ee86def16e037d476d7"} Mar 10 14:28:04 crc kubenswrapper[4911]: I0310 14:28:04.104047 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 14:28:04 crc kubenswrapper[4911]: I0310 14:28:04.149048 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552548-4fdgg" Mar 10 14:28:04 crc kubenswrapper[4911]: I0310 14:28:04.227157 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqr5n\" (UniqueName: \"kubernetes.io/projected/0d8f8654-0a72-4998-822d-7fd5d24a487e-kube-api-access-lqr5n\") pod \"0d8f8654-0a72-4998-822d-7fd5d24a487e\" (UID: \"0d8f8654-0a72-4998-822d-7fd5d24a487e\") " Mar 10 14:28:04 crc kubenswrapper[4911]: I0310 14:28:04.237391 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d8f8654-0a72-4998-822d-7fd5d24a487e-kube-api-access-lqr5n" (OuterVolumeSpecName: "kube-api-access-lqr5n") pod "0d8f8654-0a72-4998-822d-7fd5d24a487e" (UID: "0d8f8654-0a72-4998-822d-7fd5d24a487e"). InnerVolumeSpecName "kube-api-access-lqr5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:28:04 crc kubenswrapper[4911]: I0310 14:28:04.330632 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqr5n\" (UniqueName: \"kubernetes.io/projected/0d8f8654-0a72-4998-822d-7fd5d24a487e-kube-api-access-lqr5n\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:04 crc kubenswrapper[4911]: I0310 14:28:04.769218 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552548-4fdgg" event={"ID":"0d8f8654-0a72-4998-822d-7fd5d24a487e","Type":"ContainerDied","Data":"c8a62e8945ae8612cf425028f268a417244104c8199a2bab40cce1eac8c613f4"} Mar 10 14:28:04 crc kubenswrapper[4911]: I0310 14:28:04.769587 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8a62e8945ae8612cf425028f268a417244104c8199a2bab40cce1eac8c613f4" Mar 10 14:28:04 crc kubenswrapper[4911]: I0310 14:28:04.769377 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552548-4fdgg" Mar 10 14:28:05 crc kubenswrapper[4911]: I0310 14:28:05.237348 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552542-f989r"] Mar 10 14:28:05 crc kubenswrapper[4911]: I0310 14:28:05.247969 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552542-f989r"] Mar 10 14:28:06 crc kubenswrapper[4911]: I0310 14:28:06.224783 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5101cda8-039d-4775-81dc-0923e9e3e089" path="/var/lib/kubelet/pods/5101cda8-039d-4775-81dc-0923e9e3e089/volumes" Mar 10 14:28:13 crc kubenswrapper[4911]: I0310 14:28:13.989639 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 14:28:15 crc kubenswrapper[4911]: I0310 14:28:15.079468 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 14:28:18 crc kubenswrapper[4911]: I0310 14:28:18.521153 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:28:18 crc kubenswrapper[4911]: I0310 14:28:18.521693 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:28:18 crc kubenswrapper[4911]: I0310 14:28:18.521763 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:28:18 crc kubenswrapper[4911]: I0310 
14:28:18.522639 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 14:28:18 crc kubenswrapper[4911]: I0310 14:28:18.522717 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" gracePeriod=600 Mar 10 14:28:18 crc kubenswrapper[4911]: E0310 14:28:18.647506 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:28:18 crc kubenswrapper[4911]: I0310 14:28:18.771510 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" containerName="rabbitmq" containerID="cri-o://debb01c86ed7af6a4695794fce229ff3866fbbfc081c0a593c3677405f2a6730" gracePeriod=604796 Mar 10 14:28:18 crc kubenswrapper[4911]: I0310 14:28:18.932638 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" exitCode=0 Mar 10 14:28:18 crc kubenswrapper[4911]: I0310 14:28:18.932698 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"} Mar 10 14:28:18 crc kubenswrapper[4911]: I0310 14:28:18.932789 4911 scope.go:117] "RemoveContainer" containerID="19d28b4207c776d043f5f0d2450f7371800625af7e9dbf7c4bc17586e1f99a7f" Mar 10 14:28:18 crc kubenswrapper[4911]: I0310 14:28:18.933929 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:28:18 crc kubenswrapper[4911]: E0310 14:28:18.934286 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:28:19 crc kubenswrapper[4911]: I0310 14:28:19.773228 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" containerName="rabbitmq" containerID="cri-o://eee414a09d2e6bbf0705352ff16ac4aa998f7c1ace07d1bc528a4da897d60418" gracePeriod=604796 Mar 10 14:28:21 crc kubenswrapper[4911]: I0310 14:28:21.735715 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 10 14:28:22 crc kubenswrapper[4911]: I0310 14:28:22.139973 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" containerName="rabbitmq" probeResult="failure" 
output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.008393 4911 generic.go:334] "Generic (PLEG): container finished" podID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" containerID="debb01c86ed7af6a4695794fce229ff3866fbbfc081c0a593c3677405f2a6730" exitCode=0 Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.009074 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745","Type":"ContainerDied","Data":"debb01c86ed7af6a4695794fce229ff3866fbbfc081c0a593c3677405f2a6730"} Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.411440 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.551088 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-confd\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.551485 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.551625 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-config-data\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.551848 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgpfh\" 
(UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-kube-api-access-jgpfh\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.552048 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-pod-info\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.552161 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-plugins\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.552306 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-erlang-cookie-secret\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.552453 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-tls\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.552580 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-plugins-conf\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc 
kubenswrapper[4911]: I0310 14:28:25.552695 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-erlang-cookie\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.552895 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-server-conf\") pod \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\" (UID: \"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745\") " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.555442 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.555768 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.556187 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.558600 4911 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.558650 4911 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.558669 4911 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.561964 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-kube-api-access-jgpfh" (OuterVolumeSpecName: "kube-api-access-jgpfh") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "kube-api-access-jgpfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.564209 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.566667 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.567212 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-pod-info" (OuterVolumeSpecName: "pod-info") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.574765 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.597581 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-config-data" (OuterVolumeSpecName: "config-data") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.646964 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-server-conf" (OuterVolumeSpecName: "server-conf") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.660686 4911 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.660767 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.660783 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgpfh\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-kube-api-access-jgpfh\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.660796 4911 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.661000 4911 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.661011 4911 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.661022 4911 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.702126 4911 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.702602 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" (UID: "afa5978d-b0b8-4edb-b3ca-27b7bb1ee745"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.762861 4911 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:25 crc kubenswrapper[4911]: I0310 14:28:25.762897 4911 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.045197 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"afa5978d-b0b8-4edb-b3ca-27b7bb1ee745","Type":"ContainerDied","Data":"2ff5d36b7e7aa0284f0d4da1fca571dccca2f9f1fee62e228da3d6186521c29a"} Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.045289 4911 scope.go:117] "RemoveContainer" 
containerID="debb01c86ed7af6a4695794fce229ff3866fbbfc081c0a593c3677405f2a6730" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.045631 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.061609 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d1a8c73-283d-431f-bfd3-af06ca3c60ff","Type":"ContainerDied","Data":"eee414a09d2e6bbf0705352ff16ac4aa998f7c1ace07d1bc528a4da897d60418"} Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.061625 4911 generic.go:334] "Generic (PLEG): container finished" podID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" containerID="eee414a09d2e6bbf0705352ff16ac4aa998f7c1ace07d1bc528a4da897d60418" exitCode=0 Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.112682 4911 scope.go:117] "RemoveContainer" containerID="6a0ccf7ab32bbdd3db16a7cb4519b544bcfd41ff2a2f3f99c3689e09324d90f1" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.189587 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.238243 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.239927 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 14:28:26 crc kubenswrapper[4911]: E0310 14:28:26.240356 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8f8654-0a72-4998-822d-7fd5d24a487e" containerName="oc" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.240380 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8f8654-0a72-4998-822d-7fd5d24a487e" containerName="oc" Mar 10 14:28:26 crc kubenswrapper[4911]: E0310 14:28:26.240397 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" 
containerName="rabbitmq" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.240404 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" containerName="rabbitmq" Mar 10 14:28:26 crc kubenswrapper[4911]: E0310 14:28:26.240412 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" containerName="setup-container" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.240418 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" containerName="setup-container" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.240629 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d8f8654-0a72-4998-822d-7fd5d24a487e" containerName="oc" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.240658 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" containerName="rabbitmq" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.241833 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.245597 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.245660 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.245866 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.246009 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.246131 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.246204 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-q998f" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.246278 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.273051 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.381318 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.381441 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.381802 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.381986 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.382098 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.382505 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.383030 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.383059 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km5x7\" (UniqueName: \"kubernetes.io/projected/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-kube-api-access-km5x7\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.383159 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-config-data\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.383211 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.383242 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.462953 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485487 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485592 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485643 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485663 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km5x7\" (UniqueName: \"kubernetes.io/projected/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-kube-api-access-km5x7\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485698 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-config-data\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485739 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485770 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485832 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485860 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485911 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.485959 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.486074 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.486305 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.487198 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.487377 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.493394 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " 
pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.494501 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.497672 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.498576 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-config-data\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.507694 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.511462 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.535629 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km5x7\" (UniqueName: 
\"kubernetes.io/projected/85cb7ff2-e47f-46ad-a30d-6442c0fde95f-kube-api-access-km5x7\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.567764 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"85cb7ff2-e47f-46ad-a30d-6442c0fde95f\") " pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587324 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kr66\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-kube-api-access-2kr66\") pod \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587432 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-server-conf\") pod \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587458 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587491 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-plugins-conf\") pod \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587554 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-config-data\") pod \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587579 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-confd\") pod \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587630 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-tls\") pod \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587677 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-erlang-cookie-secret\") pod \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587792 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-erlang-cookie\") pod \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587825 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-plugins\") pod 
\"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.587892 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-pod-info\") pod \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\" (UID: \"8d1a8c73-283d-431f-bfd3-af06ca3c60ff\") " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.589756 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.590231 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.591298 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.593941 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.595563 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.598838 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-kube-api-access-2kr66" (OuterVolumeSpecName: "kube-api-access-2kr66") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "kube-api-access-2kr66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.604231 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-pod-info" (OuterVolumeSpecName: "pod-info") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.604269 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.629061 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-config-data" (OuterVolumeSpecName: "config-data") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.650249 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-server-conf" (OuterVolumeSpecName: "server-conf") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.691216 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kr66\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-kube-api-access-2kr66\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.691555 4911 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.691595 4911 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.691607 4911 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.691616 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.691626 4911 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.691637 4911 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.691648 4911 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.691657 4911 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.691669 4911 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.695050 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8d1a8c73-283d-431f-bfd3-af06ca3c60ff" (UID: "8d1a8c73-283d-431f-bfd3-af06ca3c60ff"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.712356 4911 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.794150 4911 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.794189 4911 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d1a8c73-283d-431f-bfd3-af06ca3c60ff-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.825153 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-pd9km"] Mar 10 14:28:26 crc kubenswrapper[4911]: E0310 14:28:26.825651 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" containerName="rabbitmq" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.825671 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" containerName="rabbitmq" Mar 10 14:28:26 crc kubenswrapper[4911]: E0310 14:28:26.825708 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" containerName="setup-container" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.825716 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" containerName="setup-container" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.826007 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" containerName="rabbitmq" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.827149 
4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.845254 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.853188 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-pd9km"] Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.868002 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.896462 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.896538 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.896629 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tqhp\" (UniqueName: \"kubernetes.io/projected/9769d997-104e-479d-a7b8-a81a1d63cab7-kube-api-access-4tqhp\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.896665 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.896713 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-config\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.896766 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.896796 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.998475 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tqhp\" (UniqueName: \"kubernetes.io/projected/9769d997-104e-479d-a7b8-a81a1d63cab7-kube-api-access-4tqhp\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.998543 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.998599 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-config\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.998653 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.998678 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.998696 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:26 crc kubenswrapper[4911]: I0310 14:28:26.998738 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.000497 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.000522 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.000560 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.000753 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.001280 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-config\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.001811 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.021009 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tqhp\" (UniqueName: \"kubernetes.io/projected/9769d997-104e-479d-a7b8-a81a1d63cab7-kube-api-access-4tqhp\") pod \"dnsmasq-dns-79bd4cc8c9-pd9km\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.086163 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d1a8c73-283d-431f-bfd3-af06ca3c60ff","Type":"ContainerDied","Data":"7df7970bd92215495a3ffe6976980013f2c6be864288ef8501b7654a7af92992"} Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.086245 4911 scope.go:117] "RemoveContainer" containerID="eee414a09d2e6bbf0705352ff16ac4aa998f7c1ace07d1bc528a4da897d60418" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.086513 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.116230 4911 scope.go:117] "RemoveContainer" containerID="59082a05080778cdaeafa7755b2401054688c324364ece5a1d2c398ee3e47500" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.155245 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.155852 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.203684 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.221083 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.224218 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.226896 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.227027 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.227390 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.227533 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.227569 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.227606 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.227709 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.228897 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s2nqt" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.304683 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.304734 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.304791 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0480ed86-7666-490a-9cd0-78a5ba05dac7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.305006 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.305038 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.305083 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0480ed86-7666-490a-9cd0-78a5ba05dac7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.305123 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/0480ed86-7666-490a-9cd0-78a5ba05dac7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.305147 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.305166 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxkmm\" (UniqueName: \"kubernetes.io/projected/0480ed86-7666-490a-9cd0-78a5ba05dac7-kube-api-access-mxkmm\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.305231 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0480ed86-7666-490a-9cd0-78a5ba05dac7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.305251 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0480ed86-7666-490a-9cd0-78a5ba05dac7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.371837 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 14:28:27 crc 
kubenswrapper[4911]: I0310 14:28:27.410569 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0480ed86-7666-490a-9cd0-78a5ba05dac7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.410615 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0480ed86-7666-490a-9cd0-78a5ba05dac7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.410678 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.410702 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.410751 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0480ed86-7666-490a-9cd0-78a5ba05dac7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.410831 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.410854 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.410881 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0480ed86-7666-490a-9cd0-78a5ba05dac7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.410908 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0480ed86-7666-490a-9cd0-78a5ba05dac7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.410932 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.410948 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxkmm\" (UniqueName: 
\"kubernetes.io/projected/0480ed86-7666-490a-9cd0-78a5ba05dac7-kube-api-access-mxkmm\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.413702 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0480ed86-7666-490a-9cd0-78a5ba05dac7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.414201 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.416768 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0480ed86-7666-490a-9cd0-78a5ba05dac7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.417876 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.418601 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.418664 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0480ed86-7666-490a-9cd0-78a5ba05dac7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.421125 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.422188 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0480ed86-7666-490a-9cd0-78a5ba05dac7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.427799 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0480ed86-7666-490a-9cd0-78a5ba05dac7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.429283 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0480ed86-7666-490a-9cd0-78a5ba05dac7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc 
kubenswrapper[4911]: I0310 14:28:27.432512 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxkmm\" (UniqueName: \"kubernetes.io/projected/0480ed86-7666-490a-9cd0-78a5ba05dac7-kube-api-access-mxkmm\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.464873 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0480ed86-7666-490a-9cd0-78a5ba05dac7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.634151 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:28:27 crc kubenswrapper[4911]: I0310 14:28:27.742021 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-pd9km"] Mar 10 14:28:28 crc kubenswrapper[4911]: I0310 14:28:28.098577 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"85cb7ff2-e47f-46ad-a30d-6442c0fde95f","Type":"ContainerStarted","Data":"7a50aedad411b4ea59d6057b823ee1e17b6c2bee0a2530e275aeee2523b90730"} Mar 10 14:28:28 crc kubenswrapper[4911]: I0310 14:28:28.103550 4911 generic.go:334] "Generic (PLEG): container finished" podID="9769d997-104e-479d-a7b8-a81a1d63cab7" containerID="de169150a916066a2951956c6a5da84c34b189327b63393a0af0ab1a8c687a8e" exitCode=0 Mar 10 14:28:28 crc kubenswrapper[4911]: I0310 14:28:28.103584 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" event={"ID":"9769d997-104e-479d-a7b8-a81a1d63cab7","Type":"ContainerDied","Data":"de169150a916066a2951956c6a5da84c34b189327b63393a0af0ab1a8c687a8e"} Mar 10 14:28:28 crc kubenswrapper[4911]: I0310 14:28:28.103604 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" event={"ID":"9769d997-104e-479d-a7b8-a81a1d63cab7","Type":"ContainerStarted","Data":"54cd93ab2a1fee9eb4f3b42e7ba87d569094a4853aa28d00fbf2a48b0714eff2"} Mar 10 14:28:28 crc kubenswrapper[4911]: I0310 14:28:28.175134 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 14:28:28 crc kubenswrapper[4911]: W0310 14:28:28.186110 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0480ed86_7666_490a_9cd0_78a5ba05dac7.slice/crio-5ef35389bc6a7cf29368bcfe7c8f63fa518b9de0d5a71c43097d250faca1f8ac WatchSource:0}: Error finding container 5ef35389bc6a7cf29368bcfe7c8f63fa518b9de0d5a71c43097d250faca1f8ac: Status 404 returned error can't find the container with id 5ef35389bc6a7cf29368bcfe7c8f63fa518b9de0d5a71c43097d250faca1f8ac Mar 10 14:28:28 crc kubenswrapper[4911]: I0310 14:28:28.211063 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1a8c73-283d-431f-bfd3-af06ca3c60ff" path="/var/lib/kubelet/pods/8d1a8c73-283d-431f-bfd3-af06ca3c60ff/volumes" Mar 10 14:28:28 crc kubenswrapper[4911]: I0310 14:28:28.212602 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afa5978d-b0b8-4edb-b3ca-27b7bb1ee745" path="/var/lib/kubelet/pods/afa5978d-b0b8-4edb-b3ca-27b7bb1ee745/volumes" Mar 10 14:28:29 crc kubenswrapper[4911]: I0310 14:28:29.994532 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:28:29 crc kubenswrapper[4911]: E0310 14:28:29.995314 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:28:30 crc kubenswrapper[4911]: I0310 14:28:30.037746 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0480ed86-7666-490a-9cd0-78a5ba05dac7","Type":"ContainerStarted","Data":"5ef35389bc6a7cf29368bcfe7c8f63fa518b9de0d5a71c43097d250faca1f8ac"} Mar 10 14:28:31 crc kubenswrapper[4911]: I0310 14:28:31.049838 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" event={"ID":"9769d997-104e-479d-a7b8-a81a1d63cab7","Type":"ContainerStarted","Data":"0ceae9311e59780c3bcd95051e6edff2a1449816e7f808e519439fd1c6b30bfe"} Mar 10 14:28:31 crc kubenswrapper[4911]: I0310 14:28:31.055579 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:31 crc kubenswrapper[4911]: I0310 14:28:31.056373 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"85cb7ff2-e47f-46ad-a30d-6442c0fde95f","Type":"ContainerStarted","Data":"87dc467714320ffb4fc727b83a672510552b6bb67c3758f4ed00071cd132111b"} Mar 10 14:28:31 crc kubenswrapper[4911]: I0310 14:28:31.057805 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0480ed86-7666-490a-9cd0-78a5ba05dac7","Type":"ContainerStarted","Data":"9e38e7a6eb67ecaa46d7217d484e5f028b8fec3b8786d13fe6c24f1766d4ed09"} Mar 10 14:28:31 crc kubenswrapper[4911]: I0310 14:28:31.080454 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" podStartSLOduration=5.080432396 podStartE2EDuration="5.080432396s" podCreationTimestamp="2026-03-10 14:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:28:31.076613915 +0000 UTC m=+1615.640133842" watchObservedRunningTime="2026-03-10 14:28:31.080432396 +0000 UTC m=+1615.643952313" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.158005 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.307900 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4rtt8"] Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.308251 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" podUID="bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" containerName="dnsmasq-dns" containerID="cri-o://d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75" gracePeriod=10 Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.459968 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-q56d9"] Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.462456 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.481937 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-q56d9"] Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.581190 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.581542 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.581657 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.581680 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-dns-svc\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.581906 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-config\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.582047 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85sdh\" (UniqueName: \"kubernetes.io/projected/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-kube-api-access-85sdh\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.582200 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.685160 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.685223 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.685324 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.685350 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-dns-svc\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.685419 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-config\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.685474 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85sdh\" (UniqueName: \"kubernetes.io/projected/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-kube-api-access-85sdh\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.685594 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.686678 4911 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.686848 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-config\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.686897 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-dns-svc\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.686949 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.687060 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.687663 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.709079 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85sdh\" (UniqueName: \"kubernetes.io/projected/ec6c0ebc-e82c-4981-a32c-8ee98d9496ec-kube-api-access-85sdh\") pod \"dnsmasq-dns-55478c4467-q56d9\" (UID: \"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec\") " pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.803254 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-q56d9" Mar 10 14:28:37 crc kubenswrapper[4911]: I0310 14:28:37.942658 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.095659 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-config\") pod \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.095926 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-swift-storage-0\") pod \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.096012 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-nb\") pod \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\" (UID: 
\"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.096083 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6drz\" (UniqueName: \"kubernetes.io/projected/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-kube-api-access-k6drz\") pod \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.096189 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-sb\") pod \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.096905 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-svc\") pod \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\" (UID: \"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6\") " Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.100704 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-kube-api-access-k6drz" (OuterVolumeSpecName: "kube-api-access-k6drz") pod "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" (UID: "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6"). InnerVolumeSpecName "kube-api-access-k6drz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.135062 4911 generic.go:334] "Generic (PLEG): container finished" podID="bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" containerID="d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75" exitCode=0 Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.135351 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" event={"ID":"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6","Type":"ContainerDied","Data":"d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75"} Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.135446 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" event={"ID":"bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6","Type":"ContainerDied","Data":"af31946cff891d8791b97294696719dcaaf6c580c82d93d5ff2c56b61e40cc21"} Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.135511 4911 scope.go:117] "RemoveContainer" containerID="d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75" Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.135711 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4rtt8" Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.156110 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" (UID: "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.156114 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" (UID: "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.160913 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" (UID: "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.170815 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-config" (OuterVolumeSpecName: "config") pod "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" (UID: "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.171252 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" (UID: "bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.184881 4911 scope.go:117] "RemoveContainer" containerID="1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d"
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.199901 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.199936 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.199947 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-config\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.199958 4911 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.199968 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.199976 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6drz\" (UniqueName: \"kubernetes.io/projected/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6-kube-api-access-k6drz\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.211280 4911 scope.go:117] "RemoveContainer" containerID="d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75"
Mar 10 14:28:38 crc kubenswrapper[4911]: E0310 14:28:38.212103 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75\": container with ID starting with d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75 not found: ID does not exist" containerID="d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75"
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.212179 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75"} err="failed to get container status \"d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75\": rpc error: code = NotFound desc = could not find container \"d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75\": container with ID starting with d11040fb0d31ffd807a42a6501916574a5aa90bd8e041ada27e816502c89ee75 not found: ID does not exist"
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.212225 4911 scope.go:117] "RemoveContainer" containerID="1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d"
Mar 10 14:28:38 crc kubenswrapper[4911]: E0310 14:28:38.212627 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d\": container with ID starting with 1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d not found: ID does not exist" containerID="1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d"
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.212663 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d"} err="failed to get container status \"1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d\": rpc error: code = NotFound desc = could not find container \"1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d\": container with ID starting with 1235030ca06dc008f8fe24c844a11cb0bd0307b665a41a7b84feacbf6fb4641d not found: ID does not exist"
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.350712 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-q56d9"]
Mar 10 14:28:38 crc kubenswrapper[4911]: W0310 14:28:38.354601 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec6c0ebc_e82c_4981_a32c_8ee98d9496ec.slice/crio-c621d1100d898da2fbd7fd0b9b08c586fe2151a4eb49ddfbc662ded51a4d92aa WatchSource:0}: Error finding container c621d1100d898da2fbd7fd0b9b08c586fe2151a4eb49ddfbc662ded51a4d92aa: Status 404 returned error can't find the container with id c621d1100d898da2fbd7fd0b9b08c586fe2151a4eb49ddfbc662ded51a4d92aa
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.606684 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4rtt8"]
Mar 10 14:28:38 crc kubenswrapper[4911]: I0310 14:28:38.619355 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4rtt8"]
Mar 10 14:28:39 crc kubenswrapper[4911]: I0310 14:28:39.147218 4911 generic.go:334] "Generic (PLEG): container finished" podID="ec6c0ebc-e82c-4981-a32c-8ee98d9496ec" containerID="a142ec8b5e8e6f690aca5bb3d1d43f35d7a32a20b6811e3acde5c0a47a3da893" exitCode=0
Mar 10 14:28:39 crc kubenswrapper[4911]: I0310 14:28:39.147308 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-q56d9" event={"ID":"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec","Type":"ContainerDied","Data":"a142ec8b5e8e6f690aca5bb3d1d43f35d7a32a20b6811e3acde5c0a47a3da893"}
Mar 10 14:28:39 crc kubenswrapper[4911]: I0310 14:28:39.147356 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-q56d9" event={"ID":"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec","Type":"ContainerStarted","Data":"c621d1100d898da2fbd7fd0b9b08c586fe2151a4eb49ddfbc662ded51a4d92aa"}
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.163322 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-q56d9" event={"ID":"ec6c0ebc-e82c-4981-a32c-8ee98d9496ec","Type":"ContainerStarted","Data":"24244e3d6189a95aa03e8f99390c515b9ed1f7b2ae1b3ebae276d4a96dfc87c4"}
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.163658 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-q56d9"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.188940 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-q56d9" podStartSLOduration=3.18891681 podStartE2EDuration="3.18891681s" podCreationTimestamp="2026-03-10 14:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:28:40.184496833 +0000 UTC m=+1624.748016750" watchObservedRunningTime="2026-03-10 14:28:40.18891681 +0000 UTC m=+1624.752436727"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.218156 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" path="/var/lib/kubelet/pods/bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6/volumes"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.637703 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h4htt"]
Mar 10 14:28:40 crc kubenswrapper[4911]: E0310 14:28:40.638771 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" containerName="dnsmasq-dns"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.638796 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" containerName="dnsmasq-dns"
Mar 10 14:28:40 crc kubenswrapper[4911]: E0310 14:28:40.638831 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" containerName="init"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.638840 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" containerName="init"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.639095 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf3bdce-cb20-4b34-8b77-dbcfd3f8aff6" containerName="dnsmasq-dns"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.640837 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.648087 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h4htt"]
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.663635 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-utilities\") pod \"community-operators-h4htt\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") " pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.663710 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz28s\" (UniqueName: \"kubernetes.io/projected/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-kube-api-access-qz28s\") pod \"community-operators-h4htt\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") " pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.664566 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-catalog-content\") pod \"community-operators-h4htt\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") " pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.766661 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-catalog-content\") pod \"community-operators-h4htt\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") " pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.767166 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-catalog-content\") pod \"community-operators-h4htt\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") " pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.767330 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-utilities\") pod \"community-operators-h4htt\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") " pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.767398 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz28s\" (UniqueName: \"kubernetes.io/projected/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-kube-api-access-qz28s\") pod \"community-operators-h4htt\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") " pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.768079 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-utilities\") pod \"community-operators-h4htt\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") " pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:40 crc kubenswrapper[4911]: I0310 14:28:40.794301 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz28s\" (UniqueName: \"kubernetes.io/projected/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-kube-api-access-qz28s\") pod \"community-operators-h4htt\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") " pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:41 crc kubenswrapper[4911]: I0310 14:28:41.002106 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:41 crc kubenswrapper[4911]: I0310 14:28:41.583800 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h4htt"]
Mar 10 14:28:42 crc kubenswrapper[4911]: I0310 14:28:42.211129 4911 generic.go:334] "Generic (PLEG): container finished" podID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerID="c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60" exitCode=0
Mar 10 14:28:42 crc kubenswrapper[4911]: I0310 14:28:42.211462 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4htt" event={"ID":"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c","Type":"ContainerDied","Data":"c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60"}
Mar 10 14:28:42 crc kubenswrapper[4911]: I0310 14:28:42.211673 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4htt" event={"ID":"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c","Type":"ContainerStarted","Data":"bbe448a6b8fdd8cba060d82b8cd2ead9e4613c05972b054446a6e03f53329c40"}
Mar 10 14:28:43 crc kubenswrapper[4911]: I0310 14:28:43.193248 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:28:43 crc kubenswrapper[4911]: E0310 14:28:43.193918 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:28:44 crc kubenswrapper[4911]: I0310 14:28:44.248931 4911 generic.go:334] "Generic (PLEG): container finished" podID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerID="b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11" exitCode=0
Mar 10 14:28:44 crc kubenswrapper[4911]: I0310 14:28:44.249002 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4htt" event={"ID":"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c","Type":"ContainerDied","Data":"b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11"}
Mar 10 14:28:45 crc kubenswrapper[4911]: I0310 14:28:45.262750 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4htt" event={"ID":"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c","Type":"ContainerStarted","Data":"3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108"}
Mar 10 14:28:45 crc kubenswrapper[4911]: I0310 14:28:45.287021 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h4htt" podStartSLOduration=2.6777442000000002 podStartE2EDuration="5.286996049s" podCreationTimestamp="2026-03-10 14:28:40 +0000 UTC" firstStartedPulling="2026-03-10 14:28:42.213899944 +0000 UTC m=+1626.777419901" lastFinishedPulling="2026-03-10 14:28:44.823151843 +0000 UTC m=+1629.386671750" observedRunningTime="2026-03-10 14:28:45.281695809 +0000 UTC m=+1629.845215756" watchObservedRunningTime="2026-03-10 14:28:45.286996049 +0000 UTC m=+1629.850515966"
Mar 10 14:28:47 crc kubenswrapper[4911]: I0310 14:28:47.806134 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-q56d9"
Mar 10 14:28:47 crc kubenswrapper[4911]: I0310 14:28:47.890810 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-pd9km"]
Mar 10 14:28:47 crc kubenswrapper[4911]: I0310 14:28:47.891164 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" podUID="9769d997-104e-479d-a7b8-a81a1d63cab7" containerName="dnsmasq-dns" containerID="cri-o://0ceae9311e59780c3bcd95051e6edff2a1449816e7f808e519439fd1c6b30bfe" gracePeriod=10
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.304858 4911 generic.go:334] "Generic (PLEG): container finished" podID="9769d997-104e-479d-a7b8-a81a1d63cab7" containerID="0ceae9311e59780c3bcd95051e6edff2a1449816e7f808e519439fd1c6b30bfe" exitCode=0
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.305266 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" event={"ID":"9769d997-104e-479d-a7b8-a81a1d63cab7","Type":"ContainerDied","Data":"0ceae9311e59780c3bcd95051e6edff2a1449816e7f808e519439fd1c6b30bfe"}
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.454924 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km"
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.568466 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tqhp\" (UniqueName: \"kubernetes.io/projected/9769d997-104e-479d-a7b8-a81a1d63cab7-kube-api-access-4tqhp\") pod \"9769d997-104e-479d-a7b8-a81a1d63cab7\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") "
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.568530 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-config\") pod \"9769d997-104e-479d-a7b8-a81a1d63cab7\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") "
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.568556 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-swift-storage-0\") pod \"9769d997-104e-479d-a7b8-a81a1d63cab7\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") "
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.568591 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-sb\") pod \"9769d997-104e-479d-a7b8-a81a1d63cab7\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") "
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.568618 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-nb\") pod \"9769d997-104e-479d-a7b8-a81a1d63cab7\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") "
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.568751 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-svc\") pod \"9769d997-104e-479d-a7b8-a81a1d63cab7\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") "
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.568820 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-openstack-edpm-ipam\") pod \"9769d997-104e-479d-a7b8-a81a1d63cab7\" (UID: \"9769d997-104e-479d-a7b8-a81a1d63cab7\") "
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.579241 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9769d997-104e-479d-a7b8-a81a1d63cab7-kube-api-access-4tqhp" (OuterVolumeSpecName: "kube-api-access-4tqhp") pod "9769d997-104e-479d-a7b8-a81a1d63cab7" (UID: "9769d997-104e-479d-a7b8-a81a1d63cab7"). InnerVolumeSpecName "kube-api-access-4tqhp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.626144 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "9769d997-104e-479d-a7b8-a81a1d63cab7" (UID: "9769d997-104e-479d-a7b8-a81a1d63cab7"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.628274 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-config" (OuterVolumeSpecName: "config") pod "9769d997-104e-479d-a7b8-a81a1d63cab7" (UID: "9769d997-104e-479d-a7b8-a81a1d63cab7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.629672 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9769d997-104e-479d-a7b8-a81a1d63cab7" (UID: "9769d997-104e-479d-a7b8-a81a1d63cab7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.629674 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9769d997-104e-479d-a7b8-a81a1d63cab7" (UID: "9769d997-104e-479d-a7b8-a81a1d63cab7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.642374 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9769d997-104e-479d-a7b8-a81a1d63cab7" (UID: "9769d997-104e-479d-a7b8-a81a1d63cab7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.653239 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9769d997-104e-479d-a7b8-a81a1d63cab7" (UID: "9769d997-104e-479d-a7b8-a81a1d63cab7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.672837 4911 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.672891 4911 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.672909 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tqhp\" (UniqueName: \"kubernetes.io/projected/9769d997-104e-479d-a7b8-a81a1d63cab7-kube-api-access-4tqhp\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.672924 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-config\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.672936 4911 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.672948 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:48 crc kubenswrapper[4911]: I0310 14:28:48.672960 4911 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9769d997-104e-479d-a7b8-a81a1d63cab7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:49 crc kubenswrapper[4911]: I0310 14:28:49.326193 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km" event={"ID":"9769d997-104e-479d-a7b8-a81a1d63cab7","Type":"ContainerDied","Data":"54cd93ab2a1fee9eb4f3b42e7ba87d569094a4853aa28d00fbf2a48b0714eff2"}
Mar 10 14:28:49 crc kubenswrapper[4911]: I0310 14:28:49.326266 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-pd9km"
Mar 10 14:28:49 crc kubenswrapper[4911]: I0310 14:28:49.326744 4911 scope.go:117] "RemoveContainer" containerID="0ceae9311e59780c3bcd95051e6edff2a1449816e7f808e519439fd1c6b30bfe"
Mar 10 14:28:49 crc kubenswrapper[4911]: I0310 14:28:49.370487 4911 scope.go:117] "RemoveContainer" containerID="de169150a916066a2951956c6a5da84c34b189327b63393a0af0ab1a8c687a8e"
Mar 10 14:28:49 crc kubenswrapper[4911]: I0310 14:28:49.385625 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-pd9km"]
Mar 10 14:28:49 crc kubenswrapper[4911]: I0310 14:28:49.402190 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-pd9km"]
Mar 10 14:28:50 crc kubenswrapper[4911]: I0310 14:28:50.208488 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9769d997-104e-479d-a7b8-a81a1d63cab7" path="/var/lib/kubelet/pods/9769d997-104e-479d-a7b8-a81a1d63cab7/volumes"
Mar 10 14:28:51 crc kubenswrapper[4911]: I0310 14:28:51.003390 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:51 crc kubenswrapper[4911]: I0310 14:28:51.003483 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:51 crc kubenswrapper[4911]: I0310 14:28:51.059574 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:51 crc kubenswrapper[4911]: I0310 14:28:51.400670 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:51 crc kubenswrapper[4911]: I0310 14:28:51.449985 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h4htt"]
Mar 10 14:28:52 crc kubenswrapper[4911]: I0310 14:28:52.637199 4911 scope.go:117] "RemoveContainer" containerID="75a1fa23f4e3cfb65b887bc4bbe3566742e62d45a418da5164eb20a5fde2529c"
Mar 10 14:28:52 crc kubenswrapper[4911]: I0310 14:28:52.767973 4911 scope.go:117] "RemoveContainer" containerID="fd545a046f233cba0f5f1b9a0d3b1abc53c9050872947bbd13a4722c17fdb802"
Mar 10 14:28:52 crc kubenswrapper[4911]: I0310 14:28:52.884469 4911 scope.go:117] "RemoveContainer" containerID="6b0ebb6cc7e4572fdfb2020007d6495d715b0e01cbc5c9fc28ec757fb61c0de8"
Mar 10 14:28:52 crc kubenswrapper[4911]: I0310 14:28:52.913711 4911 scope.go:117] "RemoveContainer" containerID="f495a1455781098ed5d3ce1cc62156705490a948b0e4ac25a4fd8e17b20971da"
Mar 10 14:28:52 crc kubenswrapper[4911]: I0310 14:28:52.966959 4911 scope.go:117] "RemoveContainer" containerID="fddef34e1538307a3db7a9a01521f575c59c430239c11ffc64af7724dfe4fbca"
Mar 10 14:28:53 crc kubenswrapper[4911]: I0310 14:28:53.374995 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h4htt" podUID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerName="registry-server" containerID="cri-o://3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108" gracePeriod=2
Mar 10 14:28:53 crc kubenswrapper[4911]: E0310 14:28:53.633431 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbabfa2b_7dd9_43b8_8199_f75e7ce5346c.slice/crio-3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108.scope\": RecentStats: unable to find data in memory cache]"
Mar 10 14:28:53 crc kubenswrapper[4911]: I0310 14:28:53.845376 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.005770 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-catalog-content\") pod \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") "
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.006068 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz28s\" (UniqueName: \"kubernetes.io/projected/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-kube-api-access-qz28s\") pod \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") "
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.006101 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-utilities\") pod \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\" (UID: \"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c\") "
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.007179 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-utilities" (OuterVolumeSpecName: "utilities") pod "bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" (UID: "bbabfa2b-7dd9-43b8-8199-f75e7ce5346c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.007314 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.011923 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-kube-api-access-qz28s" (OuterVolumeSpecName: "kube-api-access-qz28s") pod "bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" (UID: "bbabfa2b-7dd9-43b8-8199-f75e7ce5346c"). InnerVolumeSpecName "kube-api-access-qz28s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.110274 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz28s\" (UniqueName: \"kubernetes.io/projected/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-kube-api-access-qz28s\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.389866 4911 generic.go:334] "Generic (PLEG): container finished" podID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerID="3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108" exitCode=0
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.389938 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h4htt"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.389934 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4htt" event={"ID":"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c","Type":"ContainerDied","Data":"3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108"}
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.390390 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4htt" event={"ID":"bbabfa2b-7dd9-43b8-8199-f75e7ce5346c","Type":"ContainerDied","Data":"bbe448a6b8fdd8cba060d82b8cd2ead9e4613c05972b054446a6e03f53329c40"}
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.390418 4911 scope.go:117] "RemoveContainer" containerID="3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.411108 4911 scope.go:117] "RemoveContainer" containerID="b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.441881 4911 scope.go:117] "RemoveContainer" containerID="c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.481800 4911 scope.go:117] "RemoveContainer" containerID="3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108"
Mar 10 14:28:54 crc kubenswrapper[4911]: E0310 14:28:54.482484 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108\": container with ID starting with 3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108 not found: ID does not exist" containerID="3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.482548 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108"} err="failed to get container status \"3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108\": rpc error: code = NotFound desc = could not find container \"3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108\": container with ID starting with 3f7c688a888b81085bf43d554202d3967e78e3e04ad03358acad0156d9fb0108 not found: ID does not exist"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.482589 4911 scope.go:117] "RemoveContainer" containerID="b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11"
Mar 10 14:28:54 crc kubenswrapper[4911]: E0310 14:28:54.483004 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11\": container with ID starting with b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11 not found: ID does not exist" containerID="b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.483065 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11"} err="failed to get container status \"b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11\": rpc error: code = NotFound desc = could not find container \"b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11\": container with ID starting with b20628ba77102e90d64a18550d1b609ab42cc58650427ca848474aef0f175f11 not found: ID does not exist"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.483105 4911 scope.go:117] "RemoveContainer" containerID="c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60"
Mar 10 14:28:54 crc kubenswrapper[4911]: E0310 14:28:54.483455 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60\": container with ID starting with c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60 not found: ID does not exist" containerID="c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.483487 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60"} err="failed to get container status \"c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60\": rpc error: code = NotFound desc = could not find container \"c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60\": container with ID starting with c18bc4293691685245a62beb9a8564d67c9dbedd87de1028369cfe7eb5936e60 not found: ID does not exist"
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.501382 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" (UID: "bbabfa2b-7dd9-43b8-8199-f75e7ce5346c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.519394 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.756466 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h4htt"]
Mar 10 14:28:54 crc kubenswrapper[4911]: I0310 14:28:54.766936 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h4htt"]
Mar 10 14:28:56 crc kubenswrapper[4911]: I0310 14:28:56.205824 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:28:56 crc kubenswrapper[4911]: E0310 14:28:56.206452 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:28:56 crc kubenswrapper[4911]: I0310 14:28:56.216046 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" path="/var/lib/kubelet/pods/bbabfa2b-7dd9-43b8-8199-f75e7ce5346c/volumes"
Mar 10 14:29:02 crc kubenswrapper[4911]: I0310 14:29:02.490646 4911 generic.go:334] "Generic (PLEG): container finished" podID="0480ed86-7666-490a-9cd0-78a5ba05dac7" containerID="9e38e7a6eb67ecaa46d7217d484e5f028b8fec3b8786d13fe6c24f1766d4ed09" exitCode=0
Mar 10 14:29:02 crc kubenswrapper[4911]: I0310 14:29:02.490769 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0480ed86-7666-490a-9cd0-78a5ba05dac7","Type":"ContainerDied","Data":"9e38e7a6eb67ecaa46d7217d484e5f028b8fec3b8786d13fe6c24f1766d4ed09"} Mar 10 14:29:03 crc kubenswrapper[4911]: I0310 14:29:03.505524 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0480ed86-7666-490a-9cd0-78a5ba05dac7","Type":"ContainerStarted","Data":"ee9864d9c49dd87b299696e0b8585a7e2ddc71529390810a2c0a1ac4ac5f7b3d"} Mar 10 14:29:03 crc kubenswrapper[4911]: I0310 14:29:03.508120 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:29:03 crc kubenswrapper[4911]: I0310 14:29:03.511203 4911 generic.go:334] "Generic (PLEG): container finished" podID="85cb7ff2-e47f-46ad-a30d-6442c0fde95f" containerID="87dc467714320ffb4fc727b83a672510552b6bb67c3758f4ed00071cd132111b" exitCode=0 Mar 10 14:29:03 crc kubenswrapper[4911]: I0310 14:29:03.511246 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"85cb7ff2-e47f-46ad-a30d-6442c0fde95f","Type":"ContainerDied","Data":"87dc467714320ffb4fc727b83a672510552b6bb67c3758f4ed00071cd132111b"} Mar 10 14:29:03 crc kubenswrapper[4911]: I0310 14:29:03.552339 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.552313392 podStartE2EDuration="36.552313392s" podCreationTimestamp="2026-03-10 14:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:29:03.538459715 +0000 UTC m=+1648.101979652" watchObservedRunningTime="2026-03-10 14:29:03.552313392 +0000 UTC m=+1648.115833319" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.525002 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"85cb7ff2-e47f-46ad-a30d-6442c0fde95f","Type":"ContainerStarted","Data":"021424e0a9f05bb3a76b91f547c4e83f1bfbbfbf859571849c077b86f9ffa317"} Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.526246 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.561689 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.561667376 podStartE2EDuration="38.561667376s" podCreationTimestamp="2026-03-10 14:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:29:04.553738416 +0000 UTC m=+1649.117258353" watchObservedRunningTime="2026-03-10 14:29:04.561667376 +0000 UTC m=+1649.125187293" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.683925 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8"] Mar 10 14:29:04 crc kubenswrapper[4911]: E0310 14:29:04.685111 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9769d997-104e-479d-a7b8-a81a1d63cab7" containerName="init" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.685243 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9769d997-104e-479d-a7b8-a81a1d63cab7" containerName="init" Mar 10 14:29:04 crc kubenswrapper[4911]: E0310 14:29:04.685393 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9769d997-104e-479d-a7b8-a81a1d63cab7" containerName="dnsmasq-dns" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.685494 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9769d997-104e-479d-a7b8-a81a1d63cab7" containerName="dnsmasq-dns" Mar 10 14:29:04 crc kubenswrapper[4911]: E0310 14:29:04.685582 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerName="registry-server" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.685661 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerName="registry-server" Mar 10 14:29:04 crc kubenswrapper[4911]: E0310 14:29:04.685786 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerName="extract-utilities" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.685889 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerName="extract-utilities" Mar 10 14:29:04 crc kubenswrapper[4911]: E0310 14:29:04.685996 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerName="extract-content" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.686081 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerName="extract-content" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.686438 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="9769d997-104e-479d-a7b8-a81a1d63cab7" containerName="dnsmasq-dns" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.686548 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbabfa2b-7dd9-43b8-8199-f75e7ce5346c" containerName="registry-server" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.689482 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.694251 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.694542 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.694816 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.695285 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.702898 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8"] Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.768586 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqt4l\" (UniqueName: \"kubernetes.io/projected/3f912ff3-8e6b-4757-8708-865cb96e132e-kube-api-access-cqt4l\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.768680 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.768841 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.768908 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.870513 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.870663 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.870748 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.870802 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqt4l\" (UniqueName: \"kubernetes.io/projected/3f912ff3-8e6b-4757-8708-865cb96e132e-kube-api-access-cqt4l\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.877796 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.885067 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.888053 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqt4l\" (UniqueName: \"kubernetes.io/projected/3f912ff3-8e6b-4757-8708-865cb96e132e-kube-api-access-cqt4l\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:04 crc kubenswrapper[4911]: I0310 14:29:04.896891 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:05 crc kubenswrapper[4911]: I0310 14:29:05.024372 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:05 crc kubenswrapper[4911]: I0310 14:29:05.686019 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8"] Mar 10 14:29:06 crc kubenswrapper[4911]: I0310 14:29:06.547378 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" event={"ID":"3f912ff3-8e6b-4757-8708-865cb96e132e","Type":"ContainerStarted","Data":"052da05e6dffc6d324c21a33b8fc22897a6dd0d25596fa8acc80e6aafcaaeb2b"} Mar 10 14:29:10 crc kubenswrapper[4911]: I0310 14:29:10.194878 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:29:10 crc kubenswrapper[4911]: E0310 14:29:10.195427 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:29:16 crc kubenswrapper[4911]: I0310 14:29:16.871086 4911 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 14:29:17 crc kubenswrapper[4911]: I0310 14:29:17.641483 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.232410 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gtgdt"] Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.249203 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gtgdt"] Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.249342 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.372881 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm5s2\" (UniqueName: \"kubernetes.io/projected/7180cfcc-020e-451d-b85e-6b0eee91dfbb-kube-api-access-pm5s2\") pod \"certified-operators-gtgdt\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") " pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.373315 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-utilities\") pod \"certified-operators-gtgdt\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") " pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.373935 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-catalog-content\") pod \"certified-operators-gtgdt\" (UID: 
\"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") " pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.476614 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-utilities\") pod \"certified-operators-gtgdt\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") " pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.476768 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-catalog-content\") pod \"certified-operators-gtgdt\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") " pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.476809 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm5s2\" (UniqueName: \"kubernetes.io/projected/7180cfcc-020e-451d-b85e-6b0eee91dfbb-kube-api-access-pm5s2\") pod \"certified-operators-gtgdt\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") " pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.477299 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-utilities\") pod \"certified-operators-gtgdt\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") " pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.477362 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-catalog-content\") pod \"certified-operators-gtgdt\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") 
" pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.497589 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm5s2\" (UniqueName: \"kubernetes.io/projected/7180cfcc-020e-451d-b85e-6b0eee91dfbb-kube-api-access-pm5s2\") pod \"certified-operators-gtgdt\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") " pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.590310 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.730409 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" event={"ID":"3f912ff3-8e6b-4757-8708-865cb96e132e","Type":"ContainerStarted","Data":"218f25f46ce63c411ac6f10341eb99011b65108fcec5e3f105431d4e056b256f"} Mar 10 14:29:21 crc kubenswrapper[4911]: I0310 14:29:21.766908 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" podStartSLOduration=2.398811114 podStartE2EDuration="17.76687775s" podCreationTimestamp="2026-03-10 14:29:04 +0000 UTC" firstStartedPulling="2026-03-10 14:29:05.693535714 +0000 UTC m=+1650.257055641" lastFinishedPulling="2026-03-10 14:29:21.06160236 +0000 UTC m=+1665.625122277" observedRunningTime="2026-03-10 14:29:21.758266192 +0000 UTC m=+1666.321786109" watchObservedRunningTime="2026-03-10 14:29:21.76687775 +0000 UTC m=+1666.330397667" Mar 10 14:29:22 crc kubenswrapper[4911]: I0310 14:29:22.197171 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:29:22 crc kubenswrapper[4911]: E0310 14:29:22.197497 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:29:22 crc kubenswrapper[4911]: I0310 14:29:22.253159 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gtgdt"] Mar 10 14:29:22 crc kubenswrapper[4911]: I0310 14:29:22.755613 4911 generic.go:334] "Generic (PLEG): container finished" podID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerID="4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5" exitCode=0 Mar 10 14:29:22 crc kubenswrapper[4911]: I0310 14:29:22.755701 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtgdt" event={"ID":"7180cfcc-020e-451d-b85e-6b0eee91dfbb","Type":"ContainerDied","Data":"4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5"} Mar 10 14:29:22 crc kubenswrapper[4911]: I0310 14:29:22.755770 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtgdt" event={"ID":"7180cfcc-020e-451d-b85e-6b0eee91dfbb","Type":"ContainerStarted","Data":"47bf540584810876a701a262e5d517bac520f94be4a6d7744c780d193874380f"} Mar 10 14:29:23 crc kubenswrapper[4911]: I0310 14:29:23.768847 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtgdt" event={"ID":"7180cfcc-020e-451d-b85e-6b0eee91dfbb","Type":"ContainerStarted","Data":"7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e"} Mar 10 14:29:24 crc kubenswrapper[4911]: I0310 14:29:24.783636 4911 generic.go:334] "Generic (PLEG): container finished" podID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerID="7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e" exitCode=0 Mar 10 14:29:24 crc 
kubenswrapper[4911]: I0310 14:29:24.783764 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtgdt" event={"ID":"7180cfcc-020e-451d-b85e-6b0eee91dfbb","Type":"ContainerDied","Data":"7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e"} Mar 10 14:29:25 crc kubenswrapper[4911]: I0310 14:29:25.797443 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtgdt" event={"ID":"7180cfcc-020e-451d-b85e-6b0eee91dfbb","Type":"ContainerStarted","Data":"de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098"} Mar 10 14:29:25 crc kubenswrapper[4911]: I0310 14:29:25.828365 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gtgdt" podStartSLOduration=2.202996509 podStartE2EDuration="4.828337684s" podCreationTimestamp="2026-03-10 14:29:21 +0000 UTC" firstStartedPulling="2026-03-10 14:29:22.760265042 +0000 UTC m=+1667.323784969" lastFinishedPulling="2026-03-10 14:29:25.385606227 +0000 UTC m=+1669.949126144" observedRunningTime="2026-03-10 14:29:25.82367198 +0000 UTC m=+1670.387191897" watchObservedRunningTime="2026-03-10 14:29:25.828337684 +0000 UTC m=+1670.391857601" Mar 10 14:29:31 crc kubenswrapper[4911]: I0310 14:29:31.591223 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:31 crc kubenswrapper[4911]: I0310 14:29:31.592133 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:31 crc kubenswrapper[4911]: I0310 14:29:31.642719 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:31 crc kubenswrapper[4911]: I0310 14:29:31.917806 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:31 crc kubenswrapper[4911]: I0310 14:29:31.975602 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gtgdt"] Mar 10 14:29:32 crc kubenswrapper[4911]: I0310 14:29:32.879421 4911 generic.go:334] "Generic (PLEG): container finished" podID="3f912ff3-8e6b-4757-8708-865cb96e132e" containerID="218f25f46ce63c411ac6f10341eb99011b65108fcec5e3f105431d4e056b256f" exitCode=0 Mar 10 14:29:32 crc kubenswrapper[4911]: I0310 14:29:32.879572 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" event={"ID":"3f912ff3-8e6b-4757-8708-865cb96e132e","Type":"ContainerDied","Data":"218f25f46ce63c411ac6f10341eb99011b65108fcec5e3f105431d4e056b256f"} Mar 10 14:29:33 crc kubenswrapper[4911]: I0310 14:29:33.894059 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gtgdt" podUID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerName="registry-server" containerID="cri-o://de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098" gracePeriod=2 Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.548240 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.555942 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gtgdt" Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.668642 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-ssh-key-openstack-edpm-ipam\") pod \"3f912ff3-8e6b-4757-8708-865cb96e132e\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.669530 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-inventory\") pod \"3f912ff3-8e6b-4757-8708-865cb96e132e\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.669774 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-utilities\") pod \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") " Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.669893 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-repo-setup-combined-ca-bundle\") pod \"3f912ff3-8e6b-4757-8708-865cb96e132e\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.670185 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqt4l\" (UniqueName: \"kubernetes.io/projected/3f912ff3-8e6b-4757-8708-865cb96e132e-kube-api-access-cqt4l\") pod \"3f912ff3-8e6b-4757-8708-865cb96e132e\" (UID: \"3f912ff3-8e6b-4757-8708-865cb96e132e\") " Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.670325 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-catalog-content\") pod \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") "
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.670443 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm5s2\" (UniqueName: \"kubernetes.io/projected/7180cfcc-020e-451d-b85e-6b0eee91dfbb-kube-api-access-pm5s2\") pod \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\" (UID: \"7180cfcc-020e-451d-b85e-6b0eee91dfbb\") "
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.671061 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-utilities" (OuterVolumeSpecName: "utilities") pod "7180cfcc-020e-451d-b85e-6b0eee91dfbb" (UID: "7180cfcc-020e-451d-b85e-6b0eee91dfbb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.671651 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.680847 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f912ff3-8e6b-4757-8708-865cb96e132e-kube-api-access-cqt4l" (OuterVolumeSpecName: "kube-api-access-cqt4l") pod "3f912ff3-8e6b-4757-8708-865cb96e132e" (UID: "3f912ff3-8e6b-4757-8708-865cb96e132e"). InnerVolumeSpecName "kube-api-access-cqt4l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.692188 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3f912ff3-8e6b-4757-8708-865cb96e132e" (UID: "3f912ff3-8e6b-4757-8708-865cb96e132e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.692865 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7180cfcc-020e-451d-b85e-6b0eee91dfbb-kube-api-access-pm5s2" (OuterVolumeSpecName: "kube-api-access-pm5s2") pod "7180cfcc-020e-451d-b85e-6b0eee91dfbb" (UID: "7180cfcc-020e-451d-b85e-6b0eee91dfbb"). InnerVolumeSpecName "kube-api-access-pm5s2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.706194 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3f912ff3-8e6b-4757-8708-865cb96e132e" (UID: "3f912ff3-8e6b-4757-8708-865cb96e132e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.706682 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-inventory" (OuterVolumeSpecName: "inventory") pod "3f912ff3-8e6b-4757-8708-865cb96e132e" (UID: "3f912ff3-8e6b-4757-8708-865cb96e132e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.735934 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7180cfcc-020e-451d-b85e-6b0eee91dfbb" (UID: "7180cfcc-020e-451d-b85e-6b0eee91dfbb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.786662 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqt4l\" (UniqueName: \"kubernetes.io/projected/3f912ff3-8e6b-4757-8708-865cb96e132e-kube-api-access-cqt4l\") on node \"crc\" DevicePath \"\""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.786696 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7180cfcc-020e-451d-b85e-6b0eee91dfbb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.786708 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm5s2\" (UniqueName: \"kubernetes.io/projected/7180cfcc-020e-451d-b85e-6b0eee91dfbb-kube-api-access-pm5s2\") on node \"crc\" DevicePath \"\""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.786735 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.786786 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.786801 4911 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f912ff3-8e6b-4757-8708-865cb96e132e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.911964 4911 generic.go:334] "Generic (PLEG): container finished" podID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerID="de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098" exitCode=0
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.912058 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtgdt" event={"ID":"7180cfcc-020e-451d-b85e-6b0eee91dfbb","Type":"ContainerDied","Data":"de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098"}
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.912123 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gtgdt"
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.912160 4911 scope.go:117] "RemoveContainer" containerID="de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098"
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.912135 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtgdt" event={"ID":"7180cfcc-020e-451d-b85e-6b0eee91dfbb","Type":"ContainerDied","Data":"47bf540584810876a701a262e5d517bac520f94be4a6d7744c780d193874380f"}
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.915873 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8" event={"ID":"3f912ff3-8e6b-4757-8708-865cb96e132e","Type":"ContainerDied","Data":"052da05e6dffc6d324c21a33b8fc22897a6dd0d25596fa8acc80e6aafcaaeb2b"}
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.915929 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="052da05e6dffc6d324c21a33b8fc22897a6dd0d25596fa8acc80e6aafcaaeb2b"
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.916037 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8"
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.945922 4911 scope.go:117] "RemoveContainer" containerID="7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e"
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.984619 4911 scope.go:117] "RemoveContainer" containerID="4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5"
Mar 10 14:29:34 crc kubenswrapper[4911]: I0310 14:29:34.990028 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gtgdt"]
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.013560 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gtgdt"]
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.025152 4911 scope.go:117] "RemoveContainer" containerID="de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098"
Mar 10 14:29:35 crc kubenswrapper[4911]: E0310 14:29:35.030300 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098\": container with ID starting with de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098 not found: ID does not exist" containerID="de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.030368 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098"} err="failed to get container status \"de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098\": rpc error: code = NotFound desc = could not find container \"de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098\": container with ID starting with de48d1de81d320cec2b57873a79368a5c726b7000e88b26097642d2489001098 not found: ID does not exist"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.030410 4911 scope.go:117] "RemoveContainer" containerID="7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e"
Mar 10 14:29:35 crc kubenswrapper[4911]: E0310 14:29:35.030967 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e\": container with ID starting with 7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e not found: ID does not exist" containerID="7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.030997 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e"} err="failed to get container status \"7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e\": rpc error: code = NotFound desc = could not find container \"7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e\": container with ID starting with 7b073b18cb1455a94e2e545a00c313c21e836e4450bfb37f9fbb2ed620ea1d3e not found: ID does not exist"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.031017 4911 scope.go:117] "RemoveContainer" containerID="4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5"
Mar 10 14:29:35 crc kubenswrapper[4911]: E0310 14:29:35.031280 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5\": container with ID starting with 4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5 not found: ID does not exist" containerID="4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.031311 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5"} err="failed to get container status \"4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5\": rpc error: code = NotFound desc = could not find container \"4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5\": container with ID starting with 4b1c39a494845642514f38331b97d8b01e3905a20284b6fd017afd807476cbc5 not found: ID does not exist"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.037196 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"]
Mar 10 14:29:35 crc kubenswrapper[4911]: E0310 14:29:35.037778 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerName="registry-server"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.037795 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerName="registry-server"
Mar 10 14:29:35 crc kubenswrapper[4911]: E0310 14:29:35.037816 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerName="extract-content"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.037823 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerName="extract-content"
Mar 10 14:29:35 crc kubenswrapper[4911]: E0310 14:29:35.037846 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f912ff3-8e6b-4757-8708-865cb96e132e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.037854 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f912ff3-8e6b-4757-8708-865cb96e132e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:29:35 crc kubenswrapper[4911]: E0310 14:29:35.037862 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerName="extract-utilities"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.037869 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerName="extract-utilities"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.038054 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f912ff3-8e6b-4757-8708-865cb96e132e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.038082 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" containerName="registry-server"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.039058 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.046392 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.046715 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.047364 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.053918 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"]
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.055380 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.195229 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rc5zm\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.195450 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xdhr\" (UniqueName: \"kubernetes.io/projected/05cc5850-302b-49b9-a8d3-62654314670a-kube-api-access-6xdhr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rc5zm\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.195533 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rc5zm\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.297780 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xdhr\" (UniqueName: \"kubernetes.io/projected/05cc5850-302b-49b9-a8d3-62654314670a-kube-api-access-6xdhr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rc5zm\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.297872 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rc5zm\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.298025 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rc5zm\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.303432 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rc5zm\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.304243 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rc5zm\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.315584 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xdhr\" (UniqueName: \"kubernetes.io/projected/05cc5850-302b-49b9-a8d3-62654314670a-kube-api-access-6xdhr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rc5zm\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:35 crc kubenswrapper[4911]: I0310 14:29:35.398755 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:36 crc kubenswrapper[4911]: I0310 14:29:36.018252 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"]
Mar 10 14:29:36 crc kubenswrapper[4911]: I0310 14:29:36.211911 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7180cfcc-020e-451d-b85e-6b0eee91dfbb" path="/var/lib/kubelet/pods/7180cfcc-020e-451d-b85e-6b0eee91dfbb/volumes"
Mar 10 14:29:36 crc kubenswrapper[4911]: I0310 14:29:36.269187 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 14:29:36 crc kubenswrapper[4911]: I0310 14:29:36.949961 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm" event={"ID":"05cc5850-302b-49b9-a8d3-62654314670a","Type":"ContainerStarted","Data":"c63fbb468ce64b3a0fc8ddc361c8ad5149d1d02bf304116c8bbe08e0aa2c3462"}
Mar 10 14:29:36 crc kubenswrapper[4911]: I0310 14:29:36.950405 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm" event={"ID":"05cc5850-302b-49b9-a8d3-62654314670a","Type":"ContainerStarted","Data":"d6aada70c86490788eeb9eb628f49b4482f4fea64b35f5e584a07666aaa50abc"}
Mar 10 14:29:36 crc kubenswrapper[4911]: I0310 14:29:36.987297 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm" podStartSLOduration=2.741051643 podStartE2EDuration="2.987269594s" podCreationTimestamp="2026-03-10 14:29:34 +0000 UTC" firstStartedPulling="2026-03-10 14:29:36.019553462 +0000 UTC m=+1680.583073389" lastFinishedPulling="2026-03-10 14:29:36.265771413 +0000 UTC m=+1680.829291340" observedRunningTime="2026-03-10 14:29:36.977348101 +0000 UTC m=+1681.540868028" watchObservedRunningTime="2026-03-10 14:29:36.987269594 +0000 UTC m=+1681.550789511"
Mar 10 14:29:37 crc kubenswrapper[4911]: I0310 14:29:37.195651 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:29:37 crc kubenswrapper[4911]: E0310 14:29:37.196143 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:29:39 crc kubenswrapper[4911]: I0310 14:29:39.981860 4911 generic.go:334] "Generic (PLEG): container finished" podID="05cc5850-302b-49b9-a8d3-62654314670a" containerID="c63fbb468ce64b3a0fc8ddc361c8ad5149d1d02bf304116c8bbe08e0aa2c3462" exitCode=0
Mar 10 14:29:39 crc kubenswrapper[4911]: I0310 14:29:39.981946 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm" event={"ID":"05cc5850-302b-49b9-a8d3-62654314670a","Type":"ContainerDied","Data":"c63fbb468ce64b3a0fc8ddc361c8ad5149d1d02bf304116c8bbe08e0aa2c3462"}
Mar 10 14:29:41 crc kubenswrapper[4911]: I0310 14:29:41.551474 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:41 crc kubenswrapper[4911]: I0310 14:29:41.665703 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-inventory\") pod \"05cc5850-302b-49b9-a8d3-62654314670a\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") "
Mar 10 14:29:41 crc kubenswrapper[4911]: I0310 14:29:41.665916 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xdhr\" (UniqueName: \"kubernetes.io/projected/05cc5850-302b-49b9-a8d3-62654314670a-kube-api-access-6xdhr\") pod \"05cc5850-302b-49b9-a8d3-62654314670a\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") "
Mar 10 14:29:41 crc kubenswrapper[4911]: I0310 14:29:41.666125 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-ssh-key-openstack-edpm-ipam\") pod \"05cc5850-302b-49b9-a8d3-62654314670a\" (UID: \"05cc5850-302b-49b9-a8d3-62654314670a\") "
Mar 10 14:29:41 crc kubenswrapper[4911]: I0310 14:29:41.672211 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cc5850-302b-49b9-a8d3-62654314670a-kube-api-access-6xdhr" (OuterVolumeSpecName: "kube-api-access-6xdhr") pod "05cc5850-302b-49b9-a8d3-62654314670a" (UID: "05cc5850-302b-49b9-a8d3-62654314670a"). InnerVolumeSpecName "kube-api-access-6xdhr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:29:41 crc kubenswrapper[4911]: I0310 14:29:41.701959 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "05cc5850-302b-49b9-a8d3-62654314670a" (UID: "05cc5850-302b-49b9-a8d3-62654314670a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:29:41 crc kubenswrapper[4911]: I0310 14:29:41.704348 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-inventory" (OuterVolumeSpecName: "inventory") pod "05cc5850-302b-49b9-a8d3-62654314670a" (UID: "05cc5850-302b-49b9-a8d3-62654314670a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:29:41 crc kubenswrapper[4911]: I0310 14:29:41.771413 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xdhr\" (UniqueName: \"kubernetes.io/projected/05cc5850-302b-49b9-a8d3-62654314670a-kube-api-access-6xdhr\") on node \"crc\" DevicePath \"\""
Mar 10 14:29:41 crc kubenswrapper[4911]: I0310 14:29:41.771459 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 14:29:41 crc kubenswrapper[4911]: I0310 14:29:41.771473 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05cc5850-302b-49b9-a8d3-62654314670a-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.019128 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm" event={"ID":"05cc5850-302b-49b9-a8d3-62654314670a","Type":"ContainerDied","Data":"d6aada70c86490788eeb9eb628f49b4482f4fea64b35f5e584a07666aaa50abc"}
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.019462 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6aada70c86490788eeb9eb628f49b4482f4fea64b35f5e584a07666aaa50abc"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.019194 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rc5zm"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.079437 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"]
Mar 10 14:29:42 crc kubenswrapper[4911]: E0310 14:29:42.080191 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cc5850-302b-49b9-a8d3-62654314670a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.080220 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cc5850-302b-49b9-a8d3-62654314670a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.080572 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="05cc5850-302b-49b9-a8d3-62654314670a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.081677 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.087238 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.087521 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.087662 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.087713 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.098407 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"]
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.178984 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5tgg\" (UniqueName: \"kubernetes.io/projected/98224edf-8b07-4753-87d9-4f6060957d74-kube-api-access-z5tgg\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.179067 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.179098 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.179225 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.280695 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.280885 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5tgg\" (UniqueName: \"kubernetes.io/projected/98224edf-8b07-4753-87d9-4f6060957d74-kube-api-access-z5tgg\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.280921 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.280944 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.290128 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.292158 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.292348 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.307060 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5tgg\" (UniqueName: \"kubernetes.io/projected/98224edf-8b07-4753-87d9-4f6060957d74-kube-api-access-z5tgg\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.415585 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:29:42 crc kubenswrapper[4911]: I0310 14:29:42.980652 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"]
Mar 10 14:29:43 crc kubenswrapper[4911]: I0310 14:29:43.031478 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8" event={"ID":"98224edf-8b07-4753-87d9-4f6060957d74","Type":"ContainerStarted","Data":"6da3a929fe3edadf5760059edf60165073400b231442050eebd778d0f6cfdefd"}
Mar 10 14:29:44 crc kubenswrapper[4911]: I0310 14:29:44.044429 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8" event={"ID":"98224edf-8b07-4753-87d9-4f6060957d74","Type":"ContainerStarted","Data":"03cce1a572eb07110f3d1cafe185c55e91d7931810e546bfdb5da8ab441ea289"}
Mar 10 14:29:51 crc kubenswrapper[4911]: I0310 14:29:51.193599 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:29:51 crc kubenswrapper[4911]: E0310 14:29:51.195003 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:29:53 crc kubenswrapper[4911]: I0310 14:29:53.175028 4911 scope.go:117] "RemoveContainer" containerID="5a23957fa981e06933df72f1bb48cabd3f01029020258904eed5bd94e0c03126"
Mar 10 14:29:53 crc kubenswrapper[4911]: I0310 14:29:53.201605 4911 scope.go:117] "RemoveContainer" containerID="051705810d7dccfba382dd3136cb032e1f4847adc0a17c010e8866398f0209d6"
Mar 10 14:29:53 crc kubenswrapper[4911]: I0310 14:29:53.246754 4911 scope.go:117] "RemoveContainer" containerID="dd9670e57c5cef170cc3de9608ba90e388a6380241c3fe1a1378fdc2169ceb0b"
Mar 10 14:29:53 crc kubenswrapper[4911]: I0310 14:29:53.285854 4911 scope.go:117] "RemoveContainer" containerID="627e04343ef71b078e29027642a423ae4c3db5daa0055df3e1697db87a6a0daf"
Mar 10 14:29:53 crc kubenswrapper[4911]: I0310 14:29:53.316671 4911 scope.go:117] "RemoveContainer" containerID="87bf520dc0bf408a1604ba94be75b81792ccfe3d1b19d6c7f343973225cdc462"
Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.146459 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8" podStartSLOduration=17.851971717 podStartE2EDuration="18.146425386s" podCreationTimestamp="2026-03-10 14:29:42 +0000 UTC" firstStartedPulling="2026-03-10 14:29:42.98530953 +0000 UTC m=+1687.548829447" lastFinishedPulling="2026-03-10 14:29:43.279763199 +0000 UTC m=+1687.843283116" observedRunningTime="2026-03-10 14:29:44.068449238 +0000 UTC m=+1688.631969145" watchObservedRunningTime="2026-03-10 14:30:00.146425386 +0000 UTC m=+1704.709945303"
Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.158285 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552550-lt684"]
Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.160372 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552550-lt684"
Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.163217 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.163753 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.165435 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.173744 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg"]
Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.175498 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.179165 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.179305 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.208318 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-secret-volume\") pod \"collect-profiles-29552550-r4dfg\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.208402 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whnxx\" (UniqueName: \"kubernetes.io/projected/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-kube-api-access-whnxx\") pod \"collect-profiles-29552550-r4dfg\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.208448 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z99s\" (UniqueName: \"kubernetes.io/projected/20567513-e7de-4124-aa52-e707e4a5eb14-kube-api-access-4z99s\") pod \"auto-csr-approver-29552550-lt684\" (UID: \"20567513-e7de-4124-aa52-e707e4a5eb14\") " pod="openshift-infra/auto-csr-approver-29552550-lt684" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.208536 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-config-volume\") pod \"collect-profiles-29552550-r4dfg\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.213985 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552550-lt684"] Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.223972 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg"] Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.310560 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-secret-volume\") pod \"collect-profiles-29552550-r4dfg\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.310634 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whnxx\" (UniqueName: \"kubernetes.io/projected/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-kube-api-access-whnxx\") pod \"collect-profiles-29552550-r4dfg\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.310663 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z99s\" (UniqueName: \"kubernetes.io/projected/20567513-e7de-4124-aa52-e707e4a5eb14-kube-api-access-4z99s\") pod \"auto-csr-approver-29552550-lt684\" (UID: \"20567513-e7de-4124-aa52-e707e4a5eb14\") " pod="openshift-infra/auto-csr-approver-29552550-lt684" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.310720 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-config-volume\") pod \"collect-profiles-29552550-r4dfg\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.311910 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-config-volume\") pod \"collect-profiles-29552550-r4dfg\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.324489 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-secret-volume\") pod \"collect-profiles-29552550-r4dfg\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.329516 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z99s\" (UniqueName: \"kubernetes.io/projected/20567513-e7de-4124-aa52-e707e4a5eb14-kube-api-access-4z99s\") pod \"auto-csr-approver-29552550-lt684\" (UID: \"20567513-e7de-4124-aa52-e707e4a5eb14\") " pod="openshift-infra/auto-csr-approver-29552550-lt684" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.329603 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whnxx\" (UniqueName: \"kubernetes.io/projected/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-kube-api-access-whnxx\") pod \"collect-profiles-29552550-r4dfg\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.491384 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552550-lt684" Mar 10 14:30:00 crc kubenswrapper[4911]: I0310 14:30:00.500551 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:01 crc kubenswrapper[4911]: I0310 14:30:01.029439 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg"] Mar 10 14:30:01 crc kubenswrapper[4911]: I0310 14:30:01.039779 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552550-lt684"] Mar 10 14:30:01 crc kubenswrapper[4911]: I0310 14:30:01.237777 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552550-lt684" event={"ID":"20567513-e7de-4124-aa52-e707e4a5eb14","Type":"ContainerStarted","Data":"aad749d9416eefbf887b41d130ab18e381a8e945f8ed926b7a98f8527f638ab9"} Mar 10 14:30:01 crc kubenswrapper[4911]: I0310 14:30:01.240477 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" event={"ID":"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a","Type":"ContainerStarted","Data":"fff3e4c39434663385030278f7b7fc196b9c22c4e96e3b80e198dc5a9f863b71"} Mar 10 14:30:01 crc kubenswrapper[4911]: I0310 14:30:01.240573 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" event={"ID":"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a","Type":"ContainerStarted","Data":"053b7cded9922ee79266fd6355d5d1a68df9faff1f077b3a83af62d1238bfdcd"} Mar 10 14:30:01 crc kubenswrapper[4911]: I0310 14:30:01.291446 4911 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" podStartSLOduration=1.291416763 podStartE2EDuration="1.291416763s" podCreationTimestamp="2026-03-10 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:30:01.256755895 +0000 UTC m=+1705.820275812" watchObservedRunningTime="2026-03-10 14:30:01.291416763 +0000 UTC m=+1705.854936680" Mar 10 14:30:02 crc kubenswrapper[4911]: I0310 14:30:02.193782 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:30:02 crc kubenswrapper[4911]: E0310 14:30:02.194877 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:30:02 crc kubenswrapper[4911]: I0310 14:30:02.253178 4911 generic.go:334] "Generic (PLEG): container finished" podID="6b9ea89f-fb27-4fac-b4f4-d4252dd9338a" containerID="fff3e4c39434663385030278f7b7fc196b9c22c4e96e3b80e198dc5a9f863b71" exitCode=0 Mar 10 14:30:02 crc kubenswrapper[4911]: I0310 14:30:02.253236 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" event={"ID":"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a","Type":"ContainerDied","Data":"fff3e4c39434663385030278f7b7fc196b9c22c4e96e3b80e198dc5a9f863b71"} Mar 10 14:30:03 crc kubenswrapper[4911]: I0310 14:30:03.670518 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:03 crc kubenswrapper[4911]: I0310 14:30:03.788938 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-config-volume\") pod \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " Mar 10 14:30:03 crc kubenswrapper[4911]: I0310 14:30:03.789339 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-secret-volume\") pod \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " Mar 10 14:30:03 crc kubenswrapper[4911]: I0310 14:30:03.789640 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whnxx\" (UniqueName: \"kubernetes.io/projected/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-kube-api-access-whnxx\") pod \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\" (UID: \"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a\") " Mar 10 14:30:03 crc kubenswrapper[4911]: I0310 14:30:03.789915 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-config-volume" (OuterVolumeSpecName: "config-volume") pod "6b9ea89f-fb27-4fac-b4f4-d4252dd9338a" (UID: "6b9ea89f-fb27-4fac-b4f4-d4252dd9338a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:30:03 crc kubenswrapper[4911]: I0310 14:30:03.792168 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 14:30:03 crc kubenswrapper[4911]: I0310 14:30:03.794876 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6b9ea89f-fb27-4fac-b4f4-d4252dd9338a" (UID: "6b9ea89f-fb27-4fac-b4f4-d4252dd9338a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:30:03 crc kubenswrapper[4911]: I0310 14:30:03.799038 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-kube-api-access-whnxx" (OuterVolumeSpecName: "kube-api-access-whnxx") pod "6b9ea89f-fb27-4fac-b4f4-d4252dd9338a" (UID: "6b9ea89f-fb27-4fac-b4f4-d4252dd9338a"). InnerVolumeSpecName "kube-api-access-whnxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:30:03 crc kubenswrapper[4911]: I0310 14:30:03.894337 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whnxx\" (UniqueName: \"kubernetes.io/projected/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-kube-api-access-whnxx\") on node \"crc\" DevicePath \"\"" Mar 10 14:30:03 crc kubenswrapper[4911]: I0310 14:30:03.894596 4911 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 14:30:04 crc kubenswrapper[4911]: I0310 14:30:04.282695 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" event={"ID":"6b9ea89f-fb27-4fac-b4f4-d4252dd9338a","Type":"ContainerDied","Data":"053b7cded9922ee79266fd6355d5d1a68df9faff1f077b3a83af62d1238bfdcd"} Mar 10 14:30:04 crc kubenswrapper[4911]: I0310 14:30:04.283136 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="053b7cded9922ee79266fd6355d5d1a68df9faff1f077b3a83af62d1238bfdcd" Mar 10 14:30:04 crc kubenswrapper[4911]: I0310 14:30:04.282714 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg" Mar 10 14:30:04 crc kubenswrapper[4911]: I0310 14:30:04.284950 4911 generic.go:334] "Generic (PLEG): container finished" podID="20567513-e7de-4124-aa52-e707e4a5eb14" containerID="8f2346e6d23408706e1c7d4a3f0b3e851254f53cd8e7e0f9bc0d66f1da3100bf" exitCode=0 Mar 10 14:30:04 crc kubenswrapper[4911]: I0310 14:30:04.285003 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552550-lt684" event={"ID":"20567513-e7de-4124-aa52-e707e4a5eb14","Type":"ContainerDied","Data":"8f2346e6d23408706e1c7d4a3f0b3e851254f53cd8e7e0f9bc0d66f1da3100bf"} Mar 10 14:30:05 crc kubenswrapper[4911]: I0310 14:30:05.639473 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552550-lt684" Mar 10 14:30:05 crc kubenswrapper[4911]: I0310 14:30:05.728747 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z99s\" (UniqueName: \"kubernetes.io/projected/20567513-e7de-4124-aa52-e707e4a5eb14-kube-api-access-4z99s\") pod \"20567513-e7de-4124-aa52-e707e4a5eb14\" (UID: \"20567513-e7de-4124-aa52-e707e4a5eb14\") " Mar 10 14:30:05 crc kubenswrapper[4911]: I0310 14:30:05.735155 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20567513-e7de-4124-aa52-e707e4a5eb14-kube-api-access-4z99s" (OuterVolumeSpecName: "kube-api-access-4z99s") pod "20567513-e7de-4124-aa52-e707e4a5eb14" (UID: "20567513-e7de-4124-aa52-e707e4a5eb14"). InnerVolumeSpecName "kube-api-access-4z99s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:30:05 crc kubenswrapper[4911]: I0310 14:30:05.831807 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z99s\" (UniqueName: \"kubernetes.io/projected/20567513-e7de-4124-aa52-e707e4a5eb14-kube-api-access-4z99s\") on node \"crc\" DevicePath \"\"" Mar 10 14:30:06 crc kubenswrapper[4911]: I0310 14:30:06.309019 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552550-lt684" Mar 10 14:30:06 crc kubenswrapper[4911]: I0310 14:30:06.308931 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552550-lt684" event={"ID":"20567513-e7de-4124-aa52-e707e4a5eb14","Type":"ContainerDied","Data":"aad749d9416eefbf887b41d130ab18e381a8e945f8ed926b7a98f8527f638ab9"} Mar 10 14:30:06 crc kubenswrapper[4911]: I0310 14:30:06.311057 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aad749d9416eefbf887b41d130ab18e381a8e945f8ed926b7a98f8527f638ab9" Mar 10 14:30:06 crc kubenswrapper[4911]: I0310 14:30:06.712380 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552544-2kc8q"] Mar 10 14:30:06 crc kubenswrapper[4911]: I0310 14:30:06.721694 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552544-2kc8q"] Mar 10 14:30:08 crc kubenswrapper[4911]: I0310 14:30:08.213650 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31175aa-9ebc-4576-a218-b9d926c1c559" path="/var/lib/kubelet/pods/b31175aa-9ebc-4576-a218-b9d926c1c559/volumes" Mar 10 14:30:16 crc kubenswrapper[4911]: I0310 14:30:16.202396 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:30:16 crc kubenswrapper[4911]: E0310 14:30:16.203319 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:30:30 crc kubenswrapper[4911]: I0310 14:30:30.195848 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:30:30 crc kubenswrapper[4911]: E0310 14:30:30.196834 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:30:42 crc kubenswrapper[4911]: I0310 14:30:42.194564 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:30:42 crc kubenswrapper[4911]: E0310 14:30:42.195755 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:30:53 crc kubenswrapper[4911]: I0310 14:30:53.468334 4911 scope.go:117] "RemoveContainer" containerID="bc04872203b2bef81c1fb176971693f1ccf0e54cdd64ce5625bab7414127f73c" Mar 10 14:30:54 crc kubenswrapper[4911]: I0310 14:30:54.194963 4911 scope.go:117] "RemoveContainer" 
containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:30:54 crc kubenswrapper[4911]: E0310 14:30:54.195443 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:31:06 crc kubenswrapper[4911]: I0310 14:31:06.198073 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:31:06 crc kubenswrapper[4911]: E0310 14:31:06.198962 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:31:20 crc kubenswrapper[4911]: I0310 14:31:20.193674 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:31:20 crc kubenswrapper[4911]: E0310 14:31:20.194460 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:31:32 crc kubenswrapper[4911]: I0310 14:31:32.195367 4911 scope.go:117] 
"RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:31:32 crc kubenswrapper[4911]: E0310 14:31:32.196186 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:31:45 crc kubenswrapper[4911]: I0310 14:31:45.193301 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:31:45 crc kubenswrapper[4911]: E0310 14:31:45.194095 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:31:53 crc kubenswrapper[4911]: I0310 14:31:53.577769 4911 scope.go:117] "RemoveContainer" containerID="397d168ebab52c7e71bd34ca9a558a1e2dafa657ecded32ed09f11134555bcb3" Mar 10 14:31:53 crc kubenswrapper[4911]: I0310 14:31:53.606082 4911 scope.go:117] "RemoveContainer" containerID="97607404fb6fa193b39935df1b99dba949d260c588edf94d288757513a40b1f3" Mar 10 14:31:53 crc kubenswrapper[4911]: I0310 14:31:53.634247 4911 scope.go:117] "RemoveContainer" containerID="c8e7a328b8fe1b1c77334bc0dc99f9baabb5321c23a258c9ab2202910d51db76" Mar 10 14:31:53 crc kubenswrapper[4911]: I0310 14:31:53.659758 4911 scope.go:117] "RemoveContainer" containerID="c5f4bd0e4b31d2e2feeb9bb46bd491108d8930c4da09276dacc07aeca5be0c6e" Mar 10 14:31:53 crc 
kubenswrapper[4911]: I0310 14:31:53.682676 4911 scope.go:117] "RemoveContainer" containerID="d2b9bfc16bf3c24d6ab95cc5111f2866b68518587ee5b4309371c9083df253e5" Mar 10 14:31:56 crc kubenswrapper[4911]: I0310 14:31:56.198009 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:31:56 crc kubenswrapper[4911]: E0310 14:31:56.198563 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.171520 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552552-hclld"] Mar 10 14:32:00 crc kubenswrapper[4911]: E0310 14:32:00.172504 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9ea89f-fb27-4fac-b4f4-d4252dd9338a" containerName="collect-profiles" Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.172527 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9ea89f-fb27-4fac-b4f4-d4252dd9338a" containerName="collect-profiles" Mar 10 14:32:00 crc kubenswrapper[4911]: E0310 14:32:00.172553 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20567513-e7de-4124-aa52-e707e4a5eb14" containerName="oc" Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.172562 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="20567513-e7de-4124-aa52-e707e4a5eb14" containerName="oc" Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.172845 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9ea89f-fb27-4fac-b4f4-d4252dd9338a" containerName="collect-profiles" Mar 10 14:32:00 crc 
kubenswrapper[4911]: I0310 14:32:00.172877 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="20567513-e7de-4124-aa52-e707e4a5eb14" containerName="oc"
Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.173698 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552552-hclld"
Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.176752 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.177358 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.188828 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.212340 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552552-hclld"]
Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.364589 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czddd\" (UniqueName: \"kubernetes.io/projected/06250593-6f10-4848-a3d7-dc782bcee227-kube-api-access-czddd\") pod \"auto-csr-approver-29552552-hclld\" (UID: \"06250593-6f10-4848-a3d7-dc782bcee227\") " pod="openshift-infra/auto-csr-approver-29552552-hclld"
Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.467091 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czddd\" (UniqueName: \"kubernetes.io/projected/06250593-6f10-4848-a3d7-dc782bcee227-kube-api-access-czddd\") pod \"auto-csr-approver-29552552-hclld\" (UID: \"06250593-6f10-4848-a3d7-dc782bcee227\") " pod="openshift-infra/auto-csr-approver-29552552-hclld"
Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.494582 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czddd\" (UniqueName: \"kubernetes.io/projected/06250593-6f10-4848-a3d7-dc782bcee227-kube-api-access-czddd\") pod \"auto-csr-approver-29552552-hclld\" (UID: \"06250593-6f10-4848-a3d7-dc782bcee227\") " pod="openshift-infra/auto-csr-approver-29552552-hclld"
Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.504120 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552552-hclld"
Mar 10 14:32:00 crc kubenswrapper[4911]: I0310 14:32:00.998794 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552552-hclld"]
Mar 10 14:32:01 crc kubenswrapper[4911]: I0310 14:32:01.517511 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552552-hclld" event={"ID":"06250593-6f10-4848-a3d7-dc782bcee227","Type":"ContainerStarted","Data":"8bce177dccbd8452708ce1d7e51ba8bcf441565ea8f56f677083fb51d38bb4d4"}
Mar 10 14:32:04 crc kubenswrapper[4911]: I0310 14:32:04.549879 4911 generic.go:334] "Generic (PLEG): container finished" podID="06250593-6f10-4848-a3d7-dc782bcee227" containerID="81f42c0acca3bb9babf6d48a2090da8bc8ad314cd86fed989d4636807fc6cce6" exitCode=0
Mar 10 14:32:04 crc kubenswrapper[4911]: I0310 14:32:04.549974 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552552-hclld" event={"ID":"06250593-6f10-4848-a3d7-dc782bcee227","Type":"ContainerDied","Data":"81f42c0acca3bb9babf6d48a2090da8bc8ad314cd86fed989d4636807fc6cce6"}
Mar 10 14:32:06 crc kubenswrapper[4911]: I0310 14:32:06.101382 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552552-hclld"
Mar 10 14:32:06 crc kubenswrapper[4911]: I0310 14:32:06.192248 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czddd\" (UniqueName: \"kubernetes.io/projected/06250593-6f10-4848-a3d7-dc782bcee227-kube-api-access-czddd\") pod \"06250593-6f10-4848-a3d7-dc782bcee227\" (UID: \"06250593-6f10-4848-a3d7-dc782bcee227\") "
Mar 10 14:32:06 crc kubenswrapper[4911]: I0310 14:32:06.206142 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06250593-6f10-4848-a3d7-dc782bcee227-kube-api-access-czddd" (OuterVolumeSpecName: "kube-api-access-czddd") pod "06250593-6f10-4848-a3d7-dc782bcee227" (UID: "06250593-6f10-4848-a3d7-dc782bcee227"). InnerVolumeSpecName "kube-api-access-czddd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:32:06 crc kubenswrapper[4911]: I0310 14:32:06.294837 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czddd\" (UniqueName: \"kubernetes.io/projected/06250593-6f10-4848-a3d7-dc782bcee227-kube-api-access-czddd\") on node \"crc\" DevicePath \"\""
Mar 10 14:32:06 crc kubenswrapper[4911]: I0310 14:32:06.574278 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552552-hclld" event={"ID":"06250593-6f10-4848-a3d7-dc782bcee227","Type":"ContainerDied","Data":"8bce177dccbd8452708ce1d7e51ba8bcf441565ea8f56f677083fb51d38bb4d4"}
Mar 10 14:32:06 crc kubenswrapper[4911]: I0310 14:32:06.574756 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bce177dccbd8452708ce1d7e51ba8bcf441565ea8f56f677083fb51d38bb4d4"
Mar 10 14:32:06 crc kubenswrapper[4911]: I0310 14:32:06.574401 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552552-hclld"
Mar 10 14:32:07 crc kubenswrapper[4911]: I0310 14:32:07.182695 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552546-rw8md"]
Mar 10 14:32:07 crc kubenswrapper[4911]: I0310 14:32:07.193397 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552546-rw8md"]
Mar 10 14:32:07 crc kubenswrapper[4911]: I0310 14:32:07.194274 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:32:07 crc kubenswrapper[4911]: E0310 14:32:07.195008 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:32:08 crc kubenswrapper[4911]: I0310 14:32:08.207424 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e" path="/var/lib/kubelet/pods/a2dbe466-bd9f-44bb-90d0-7f3ccdc4641e/volumes"
Mar 10 14:32:19 crc kubenswrapper[4911]: I0310 14:32:19.077586 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nk2c4"]
Mar 10 14:32:19 crc kubenswrapper[4911]: I0310 14:32:19.092492 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7df8-account-create-update-d9npt"]
Mar 10 14:32:19 crc kubenswrapper[4911]: I0310 14:32:19.104408 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nk2c4"]
Mar 10 14:32:19 crc kubenswrapper[4911]: I0310 14:32:19.115018 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7df8-account-create-update-d9npt"]
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.033330 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6rbps"]
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.043800 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ac1c-account-create-update-2vkc9"]
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.056184 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6rbps"]
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.070928 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-47sff"]
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.080189 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ac1c-account-create-update-2vkc9"]
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.088182 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b41f-account-create-update-h7v89"]
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.096911 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-47sff"]
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.105510 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b41f-account-create-update-h7v89"]
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.193230 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:32:20 crc kubenswrapper[4911]: E0310 14:32:20.193494 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.205139 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a50823-d870-43f3-8599-ac8cf13ce7c8" path="/var/lib/kubelet/pods/14a50823-d870-43f3-8599-ac8cf13ce7c8/volumes"
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.205793 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16670393-24c4-4df7-b240-0509b1866a1a" path="/var/lib/kubelet/pods/16670393-24c4-4df7-b240-0509b1866a1a/volumes"
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.206360 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f1f748-2a8f-4786-907a-6800ad1c999c" path="/var/lib/kubelet/pods/16f1f748-2a8f-4786-907a-6800ad1c999c/volumes"
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.206945 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71f3b5bd-f962-4c8b-a792-01badc31a9d4" path="/var/lib/kubelet/pods/71f3b5bd-f962-4c8b-a792-01badc31a9d4/volumes"
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.208136 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bda7dfa-e3a9-4f11-9872-b83c855f7df6" path="/var/lib/kubelet/pods/7bda7dfa-e3a9-4f11-9872-b83c855f7df6/volumes"
Mar 10 14:32:20 crc kubenswrapper[4911]: I0310 14:32:20.208750 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8905bd52-ff68-4e7c-818f-bf2a38b80b8f" path="/var/lib/kubelet/pods/8905bd52-ff68-4e7c-818f-bf2a38b80b8f/volumes"
Mar 10 14:32:32 crc kubenswrapper[4911]: I0310 14:32:32.194573 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:32:32 crc kubenswrapper[4911]: E0310 14:32:32.195410 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:32:40 crc kubenswrapper[4911]: I0310 14:32:40.911833 4911 generic.go:334] "Generic (PLEG): container finished" podID="98224edf-8b07-4753-87d9-4f6060957d74" containerID="03cce1a572eb07110f3d1cafe185c55e91d7931810e546bfdb5da8ab441ea289" exitCode=0
Mar 10 14:32:40 crc kubenswrapper[4911]: I0310 14:32:40.911985 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8" event={"ID":"98224edf-8b07-4753-87d9-4f6060957d74","Type":"ContainerDied","Data":"03cce1a572eb07110f3d1cafe185c55e91d7931810e546bfdb5da8ab441ea289"}
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.326369 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.506569 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5tgg\" (UniqueName: \"kubernetes.io/projected/98224edf-8b07-4753-87d9-4f6060957d74-kube-api-access-z5tgg\") pod \"98224edf-8b07-4753-87d9-4f6060957d74\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") "
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.506713 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-ssh-key-openstack-edpm-ipam\") pod \"98224edf-8b07-4753-87d9-4f6060957d74\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") "
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.506756 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-bootstrap-combined-ca-bundle\") pod \"98224edf-8b07-4753-87d9-4f6060957d74\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") "
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.506842 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-inventory\") pod \"98224edf-8b07-4753-87d9-4f6060957d74\" (UID: \"98224edf-8b07-4753-87d9-4f6060957d74\") "
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.512673 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "98224edf-8b07-4753-87d9-4f6060957d74" (UID: "98224edf-8b07-4753-87d9-4f6060957d74"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.513957 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98224edf-8b07-4753-87d9-4f6060957d74-kube-api-access-z5tgg" (OuterVolumeSpecName: "kube-api-access-z5tgg") pod "98224edf-8b07-4753-87d9-4f6060957d74" (UID: "98224edf-8b07-4753-87d9-4f6060957d74"). InnerVolumeSpecName "kube-api-access-z5tgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.532873 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "98224edf-8b07-4753-87d9-4f6060957d74" (UID: "98224edf-8b07-4753-87d9-4f6060957d74"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.533900 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-inventory" (OuterVolumeSpecName: "inventory") pod "98224edf-8b07-4753-87d9-4f6060957d74" (UID: "98224edf-8b07-4753-87d9-4f6060957d74"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.609763 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5tgg\" (UniqueName: \"kubernetes.io/projected/98224edf-8b07-4753-87d9-4f6060957d74-kube-api-access-z5tgg\") on node \"crc\" DevicePath \"\""
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.609822 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.609840 4911 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.609856 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98224edf-8b07-4753-87d9-4f6060957d74-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.933098 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8" event={"ID":"98224edf-8b07-4753-87d9-4f6060957d74","Type":"ContainerDied","Data":"6da3a929fe3edadf5760059edf60165073400b231442050eebd778d0f6cfdefd"}
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.933422 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da3a929fe3edadf5760059edf60165073400b231442050eebd778d0f6cfdefd"
Mar 10 14:32:42 crc kubenswrapper[4911]: I0310 14:32:42.933150 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.034193 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"]
Mar 10 14:32:43 crc kubenswrapper[4911]: E0310 14:32:43.034919 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06250593-6f10-4848-a3d7-dc782bcee227" containerName="oc"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.034943 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="06250593-6f10-4848-a3d7-dc782bcee227" containerName="oc"
Mar 10 14:32:43 crc kubenswrapper[4911]: E0310 14:32:43.034983 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98224edf-8b07-4753-87d9-4f6060957d74" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.034996 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="98224edf-8b07-4753-87d9-4f6060957d74" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.035244 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="98224edf-8b07-4753-87d9-4f6060957d74" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.035285 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="06250593-6f10-4848-a3d7-dc782bcee227" containerName="oc"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.036279 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.040838 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.041005 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.041900 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.045492 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.049359 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"]
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.068448 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7fwgv"]
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.077873 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7fwgv"]
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.225665 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.226201 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knrfw\" (UniqueName: \"kubernetes.io/projected/37e4aed4-039e-4b2b-89d7-65c43eb8f688-kube-api-access-knrfw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.226427 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.328296 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.328976 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.329099 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knrfw\" (UniqueName: \"kubernetes.io/projected/37e4aed4-039e-4b2b-89d7-65c43eb8f688-kube-api-access-knrfw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.333618 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.334809 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.350754 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knrfw\" (UniqueName: \"kubernetes.io/projected/37e4aed4-039e-4b2b-89d7-65c43eb8f688-kube-api-access-knrfw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.376929 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.949767 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb"]
Mar 10 14:32:43 crc kubenswrapper[4911]: I0310 14:32:43.962067 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 14:32:44 crc kubenswrapper[4911]: I0310 14:32:44.212011 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e91396-f5e0-46f1-99ce-394c03a93db3" path="/var/lib/kubelet/pods/83e91396-f5e0-46f1-99ce-394c03a93db3/volumes"
Mar 10 14:32:44 crc kubenswrapper[4911]: I0310 14:32:44.955523 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb" event={"ID":"37e4aed4-039e-4b2b-89d7-65c43eb8f688","Type":"ContainerStarted","Data":"a1fc93cd561ee81dd705a9c871f0715d6129ae4a0df1ac57a11d897fccddfeac"}
Mar 10 14:32:44 crc kubenswrapper[4911]: I0310 14:32:44.955958 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb" event={"ID":"37e4aed4-039e-4b2b-89d7-65c43eb8f688","Type":"ContainerStarted","Data":"69586b536fe0bda40e744761e30fced3a1f0ad2bbff68f0b706c44b4fa4f39e3"}
Mar 10 14:32:44 crc kubenswrapper[4911]: I0310 14:32:44.982234 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb" podStartSLOduration=1.808045426 podStartE2EDuration="1.982206646s" podCreationTimestamp="2026-03-10 14:32:43 +0000 UTC" firstStartedPulling="2026-03-10 14:32:43.961581983 +0000 UTC m=+1868.525101900" lastFinishedPulling="2026-03-10 14:32:44.135743193 +0000 UTC m=+1868.699263120" observedRunningTime="2026-03-10 14:32:44.970814843 +0000 UTC m=+1869.534334761" watchObservedRunningTime="2026-03-10 14:32:44.982206646 +0000 UTC m=+1869.545726573"
Mar 10 14:32:46 crc kubenswrapper[4911]: I0310 14:32:46.200386 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:32:46 crc kubenswrapper[4911]: E0310 14:32:46.200716 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:32:50 crc kubenswrapper[4911]: I0310 14:32:50.033671 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-6v9wc"]
Mar 10 14:32:50 crc kubenswrapper[4911]: I0310 14:32:50.046248 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-6v9wc"]
Mar 10 14:32:50 crc kubenswrapper[4911]: I0310 14:32:50.204950 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a49be6ea-3e0f-4975-89ff-9cfd022e4980" path="/var/lib/kubelet/pods/a49be6ea-3e0f-4975-89ff-9cfd022e4980/volumes"
Mar 10 14:32:53 crc kubenswrapper[4911]: I0310 14:32:53.774069 4911 scope.go:117] "RemoveContainer" containerID="e36118380955427affb8dc83b7d5aac3c92cfe0618dd0fd5bc1e0373caf12157"
Mar 10 14:32:53 crc kubenswrapper[4911]: I0310 14:32:53.841231 4911 scope.go:117] "RemoveContainer" containerID="3239adfda8fcd0f880275aa7f28ea5f8a24c039c002aa9def32a7ee99f37b24b"
Mar 10 14:32:53 crc kubenswrapper[4911]: I0310 14:32:53.867376 4911 scope.go:117] "RemoveContainer" containerID="654710fd98e60b38b77a58e75ae5052b12e196850bf1051c9beecb856314b843"
Mar 10 14:32:53 crc kubenswrapper[4911]: I0310 14:32:53.916684 4911 scope.go:117] "RemoveContainer" containerID="7c95aadb6d53bb8b916f8fe848a8a68b6048ba1a8a92df065a960f73dcd7ddad"
Mar 10 14:32:53 crc kubenswrapper[4911]: I0310 14:32:53.962406 4911 scope.go:117] "RemoveContainer" containerID="e1c86c8fa4589719e51bcea2266d9340cdfb3e3d41126c995121630935c5337e"
Mar 10 14:32:54 crc kubenswrapper[4911]: I0310 14:32:54.036187 4911 scope.go:117] "RemoveContainer" containerID="1373d6351cb965a4a3cceec35a025110e2796cc36818ab19142c65a000244b44"
Mar 10 14:32:54 crc kubenswrapper[4911]: I0310 14:32:54.082198 4911 scope.go:117] "RemoveContainer" containerID="ef23ba72f589d96b300db6d4d42e933c2309b6c7e54564487471dc7ac93699f1"
Mar 10 14:32:54 crc kubenswrapper[4911]: I0310 14:32:54.105904 4911 scope.go:117] "RemoveContainer" containerID="4271682c8fca9a85282ff7eeb6a7eae6f6496bed656befdc203c85d336bfc4a8"
Mar 10 14:32:54 crc kubenswrapper[4911]: I0310 14:32:54.127477 4911 scope.go:117] "RemoveContainer" containerID="35d01e8c663507a7d71b55c9602d82e99b54f742c413bd58ad0542bd00b36ea5"
Mar 10 14:33:01 crc kubenswrapper[4911]: I0310 14:33:01.194265 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:33:01 crc kubenswrapper[4911]: E0310 14:33:01.195424 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.085282 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-vsqwc"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.094641 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3438-account-create-update-6zd8p"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.106067 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9400-account-create-update-bl6w7"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.115838 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-gq52w"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.125529 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3ba5-account-create-update-vrg4k"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.134997 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-h9vhh"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.157536 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9400-account-create-update-bl6w7"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.157627 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3ba5-account-create-update-vrg4k"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.177712 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-h9vhh"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.177815 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-gq52w"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.184540 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3438-account-create-update-6zd8p"]
Mar 10 14:33:11 crc kubenswrapper[4911]: I0310 14:33:11.190684 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-vsqwc"]
Mar 10 14:33:12 crc kubenswrapper[4911]: I0310 14:33:12.215036 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316f04a5-be26-4b2b-beb5-54c8851c589f" path="/var/lib/kubelet/pods/316f04a5-be26-4b2b-beb5-54c8851c589f/volumes"
Mar 10 14:33:12 crc kubenswrapper[4911]: I0310 14:33:12.215749 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e171a5e-2cdb-458b-9fe3-aeb5cde435c4" path="/var/lib/kubelet/pods/4e171a5e-2cdb-458b-9fe3-aeb5cde435c4/volumes"
Mar 10 14:33:12 crc kubenswrapper[4911]: I0310 14:33:12.216303 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e568bb4-835b-4478-a92e-c8a2bd91f48c" path="/var/lib/kubelet/pods/9e568bb4-835b-4478-a92e-c8a2bd91f48c/volumes"
Mar 10 14:33:12 crc kubenswrapper[4911]: I0310 14:33:12.216919 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b856dbd6-5849-42aa-a88a-942fd434e4cd" path="/var/lib/kubelet/pods/b856dbd6-5849-42aa-a88a-942fd434e4cd/volumes"
Mar 10 14:33:12 crc kubenswrapper[4911]: I0310 14:33:12.218191 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4f56e0-2bd6-482b-a147-9fccecd2aecc" path="/var/lib/kubelet/pods/be4f56e0-2bd6-482b-a147-9fccecd2aecc/volumes"
Mar 10 14:33:12 crc kubenswrapper[4911]: I0310 14:33:12.218804 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f420a99e-9a7a-402f-abef-e299c12a33bc" path="/var/lib/kubelet/pods/f420a99e-9a7a-402f-abef-e299c12a33bc/volumes"
Mar 10 14:33:15 crc kubenswrapper[4911]: I0310 14:33:15.193597 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:33:15 crc kubenswrapper[4911]: E0310 14:33:15.194542 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:33:16 crc kubenswrapper[4911]: I0310 14:33:16.047974 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-hs9pl"]
Mar 10 14:33:16 crc kubenswrapper[4911]: I0310 14:33:16.056881 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-hs9pl"]
Mar 10 14:33:16 crc kubenswrapper[4911]: I0310 14:33:16.210171 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa710f1a-fd18-49b5-bdd1-3afbe21047ec" path="/var/lib/kubelet/pods/fa710f1a-fd18-49b5-bdd1-3afbe21047ec/volumes"
Mar 10 14:33:29 crc kubenswrapper[4911]: I0310 14:33:29.194341 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c"
Mar 10 14:33:29 crc kubenswrapper[4911]: I0310 14:33:29.844426 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"4cb4b818a888d5417b0612f19c0270a3d874b51a5ee2051ea9d2db487bebc236"}
Mar 10 14:33:47 crc kubenswrapper[4911]: I0310 14:33:47.052109 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wk4q4"]
Mar 10 14:33:47 crc kubenswrapper[4911]: I0310 14:33:47.064936 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wk4q4"]
Mar 10 14:33:48 crc kubenswrapper[4911]: I0310 14:33:48.210354 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0db6312-44e4-4765-aabf-1d8620893756" path="/var/lib/kubelet/pods/f0db6312-44e4-4765-aabf-1d8620893756/volumes"
Mar 10 14:33:54 crc kubenswrapper[4911]: I0310 14:33:54.315268 4911 scope.go:117] "RemoveContainer" containerID="8298867308da7df6b329a7316ecb7f689329f3d3eed6c906c0e9b44c2d7125ef"
Mar 10 14:33:54 crc kubenswrapper[4911]: I0310 14:33:54.359934 4911 scope.go:117] "RemoveContainer" containerID="80b7f93e162fca480a7790b54ad1f4f967f99683aaa9320f6c7cd2223ef3704a"
Mar 10 14:33:54 crc kubenswrapper[4911]: I0310 14:33:54.396492 4911 scope.go:117] "RemoveContainer" containerID="2bd551f48cab288b0d0fb2e8adb031410c098f214687834958bc3337ddfab601"
Mar 10 14:33:54 crc kubenswrapper[4911]: I0310 14:33:54.465857 4911 scope.go:117] "RemoveContainer" containerID="bbefc3fbf5583dfe5be76c340bf542f4715b5abaac8cd374f710ec6137a24e23"
Mar 10 14:33:54 crc kubenswrapper[4911]: I0310 14:33:54.507588 4911 scope.go:117] "RemoveContainer" containerID="38c1707684e3284b1a4f9941449161009abe645bb461871361585173ad792c23"
Mar 10 14:33:54 crc kubenswrapper[4911]: I0310 14:33:54.585491 4911 scope.go:117] "RemoveContainer" containerID="88ffca46975943a274764ed341dbce2416b33f42d0a79fa7a271350471da528f"
Mar 10 14:33:54 crc kubenswrapper[4911]: I0310 14:33:54.608677 4911 scope.go:117] "RemoveContainer" containerID="7e0018a8a78228a04af27046b61e98bbd97a4f3e18c6cee45893c102e2ebb75b"
Mar 10 14:33:54 crc kubenswrapper[4911]: I0310 14:33:54.668083 4911 scope.go:117] "RemoveContainer" containerID="712a84bd650d353d290828d088e888703a58b95b23832a4b73e1ff00ca78cc71"
Mar 10 14:33:59 crc kubenswrapper[4911]: I0310 14:33:59.040124 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9w5xj"]
Mar 10 14:33:59 crc kubenswrapper[4911]: I0310 14:33:59.050422 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9w5xj"]
Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.152094 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552554-t65c4"]
Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.156974 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552554-t65c4"
Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.162227 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.162741 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.163052 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.219705 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f554253-d7fc-4ca5-8f4b-fafd051fbee4" path="/var/lib/kubelet/pods/6f554253-d7fc-4ca5-8f4b-fafd051fbee4/volumes"
Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.220687 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552554-t65c4"]
Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.336243 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpq95\" (UniqueName: \"kubernetes.io/projected/e91a7cec-4b2e-4ec7-bb08-f5d6043394cc-kube-api-access-kpq95\") pod \"auto-csr-approver-29552554-t65c4\" (UID: \"e91a7cec-4b2e-4ec7-bb08-f5d6043394cc\") " pod="openshift-infra/auto-csr-approver-29552554-t65c4"
Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.439086 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpq95\" (UniqueName: \"kubernetes.io/projected/e91a7cec-4b2e-4ec7-bb08-f5d6043394cc-kube-api-access-kpq95\") pod \"auto-csr-approver-29552554-t65c4\" (UID: \"e91a7cec-4b2e-4ec7-bb08-f5d6043394cc\") " pod="openshift-infra/auto-csr-approver-29552554-t65c4"
Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.464861 4911 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"kube-api-access-kpq95\" (UniqueName: \"kubernetes.io/projected/e91a7cec-4b2e-4ec7-bb08-f5d6043394cc-kube-api-access-kpq95\") pod \"auto-csr-approver-29552554-t65c4\" (UID: \"e91a7cec-4b2e-4ec7-bb08-f5d6043394cc\") " pod="openshift-infra/auto-csr-approver-29552554-t65c4" Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.487214 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552554-t65c4" Mar 10 14:34:00 crc kubenswrapper[4911]: I0310 14:34:00.968232 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552554-t65c4"] Mar 10 14:34:01 crc kubenswrapper[4911]: I0310 14:34:01.161313 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552554-t65c4" event={"ID":"e91a7cec-4b2e-4ec7-bb08-f5d6043394cc","Type":"ContainerStarted","Data":"43223fc875dee9ef64f226f8344b56a511547673a6438750991014b906e39338"} Mar 10 14:34:02 crc kubenswrapper[4911]: I0310 14:34:02.173380 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552554-t65c4" event={"ID":"e91a7cec-4b2e-4ec7-bb08-f5d6043394cc","Type":"ContainerStarted","Data":"ef827851c33cd04d9615d1757eac7601489214572902f15b804fa36a54086471"} Mar 10 14:34:02 crc kubenswrapper[4911]: I0310 14:34:02.200332 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552554-t65c4" podStartSLOduration=1.340955728 podStartE2EDuration="2.200299592s" podCreationTimestamp="2026-03-10 14:34:00 +0000 UTC" firstStartedPulling="2026-03-10 14:34:00.978087263 +0000 UTC m=+1945.541607180" lastFinishedPulling="2026-03-10 14:34:01.837431127 +0000 UTC m=+1946.400951044" observedRunningTime="2026-03-10 14:34:02.190381399 +0000 UTC m=+1946.753901326" watchObservedRunningTime="2026-03-10 14:34:02.200299592 +0000 UTC m=+1946.763819509" Mar 10 14:34:03 crc 
kubenswrapper[4911]: I0310 14:34:03.186296 4911 generic.go:334] "Generic (PLEG): container finished" podID="e91a7cec-4b2e-4ec7-bb08-f5d6043394cc" containerID="ef827851c33cd04d9615d1757eac7601489214572902f15b804fa36a54086471" exitCode=0 Mar 10 14:34:03 crc kubenswrapper[4911]: I0310 14:34:03.186381 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552554-t65c4" event={"ID":"e91a7cec-4b2e-4ec7-bb08-f5d6043394cc","Type":"ContainerDied","Data":"ef827851c33cd04d9615d1757eac7601489214572902f15b804fa36a54086471"} Mar 10 14:34:04 crc kubenswrapper[4911]: I0310 14:34:04.040463 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2s85z"] Mar 10 14:34:04 crc kubenswrapper[4911]: I0310 14:34:04.054576 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2s85z"] Mar 10 14:34:04 crc kubenswrapper[4911]: I0310 14:34:04.214280 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c9857f-704f-48ce-b90b-9275e9eba41a" path="/var/lib/kubelet/pods/86c9857f-704f-48ce-b90b-9275e9eba41a/volumes" Mar 10 14:34:04 crc kubenswrapper[4911]: I0310 14:34:04.551926 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552554-t65c4" Mar 10 14:34:04 crc kubenswrapper[4911]: I0310 14:34:04.642831 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpq95\" (UniqueName: \"kubernetes.io/projected/e91a7cec-4b2e-4ec7-bb08-f5d6043394cc-kube-api-access-kpq95\") pod \"e91a7cec-4b2e-4ec7-bb08-f5d6043394cc\" (UID: \"e91a7cec-4b2e-4ec7-bb08-f5d6043394cc\") " Mar 10 14:34:04 crc kubenswrapper[4911]: I0310 14:34:04.650282 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e91a7cec-4b2e-4ec7-bb08-f5d6043394cc-kube-api-access-kpq95" (OuterVolumeSpecName: "kube-api-access-kpq95") pod "e91a7cec-4b2e-4ec7-bb08-f5d6043394cc" (UID: "e91a7cec-4b2e-4ec7-bb08-f5d6043394cc"). InnerVolumeSpecName "kube-api-access-kpq95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:34:04 crc kubenswrapper[4911]: I0310 14:34:04.745983 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpq95\" (UniqueName: \"kubernetes.io/projected/e91a7cec-4b2e-4ec7-bb08-f5d6043394cc-kube-api-access-kpq95\") on node \"crc\" DevicePath \"\"" Mar 10 14:34:05 crc kubenswrapper[4911]: I0310 14:34:05.208903 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552554-t65c4" Mar 10 14:34:05 crc kubenswrapper[4911]: I0310 14:34:05.208888 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552554-t65c4" event={"ID":"e91a7cec-4b2e-4ec7-bb08-f5d6043394cc","Type":"ContainerDied","Data":"43223fc875dee9ef64f226f8344b56a511547673a6438750991014b906e39338"} Mar 10 14:34:05 crc kubenswrapper[4911]: I0310 14:34:05.209406 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43223fc875dee9ef64f226f8344b56a511547673a6438750991014b906e39338" Mar 10 14:34:05 crc kubenswrapper[4911]: I0310 14:34:05.217296 4911 generic.go:334] "Generic (PLEG): container finished" podID="37e4aed4-039e-4b2b-89d7-65c43eb8f688" containerID="a1fc93cd561ee81dd705a9c871f0715d6129ae4a0df1ac57a11d897fccddfeac" exitCode=0 Mar 10 14:34:05 crc kubenswrapper[4911]: I0310 14:34:05.217349 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb" event={"ID":"37e4aed4-039e-4b2b-89d7-65c43eb8f688","Type":"ContainerDied","Data":"a1fc93cd561ee81dd705a9c871f0715d6129ae4a0df1ac57a11d897fccddfeac"} Mar 10 14:34:05 crc kubenswrapper[4911]: I0310 14:34:05.267654 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552548-4fdgg"] Mar 10 14:34:05 crc kubenswrapper[4911]: I0310 14:34:05.277460 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552548-4fdgg"] Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.037255 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-85swm"] Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.046689 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-85swm"] Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.210716 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="0d8f8654-0a72-4998-822d-7fd5d24a487e" path="/var/lib/kubelet/pods/0d8f8654-0a72-4998-822d-7fd5d24a487e/volumes" Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.211879 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b878c7-d1cf-4656-8762-7be57cf1491a" path="/var/lib/kubelet/pods/67b878c7-d1cf-4656-8762-7be57cf1491a/volumes" Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.656275 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb" Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.789943 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-ssh-key-openstack-edpm-ipam\") pod \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.790054 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-inventory\") pod \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.790329 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knrfw\" (UniqueName: \"kubernetes.io/projected/37e4aed4-039e-4b2b-89d7-65c43eb8f688-kube-api-access-knrfw\") pod \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\" (UID: \"37e4aed4-039e-4b2b-89d7-65c43eb8f688\") " Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.797526 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e4aed4-039e-4b2b-89d7-65c43eb8f688-kube-api-access-knrfw" (OuterVolumeSpecName: "kube-api-access-knrfw") pod 
"37e4aed4-039e-4b2b-89d7-65c43eb8f688" (UID: "37e4aed4-039e-4b2b-89d7-65c43eb8f688"). InnerVolumeSpecName "kube-api-access-knrfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.820886 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-inventory" (OuterVolumeSpecName: "inventory") pod "37e4aed4-039e-4b2b-89d7-65c43eb8f688" (UID: "37e4aed4-039e-4b2b-89d7-65c43eb8f688"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.832090 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "37e4aed4-039e-4b2b-89d7-65c43eb8f688" (UID: "37e4aed4-039e-4b2b-89d7-65c43eb8f688"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.893820 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knrfw\" (UniqueName: \"kubernetes.io/projected/37e4aed4-039e-4b2b-89d7-65c43eb8f688-kube-api-access-knrfw\") on node \"crc\" DevicePath \"\"" Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.893882 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 14:34:06 crc kubenswrapper[4911]: I0310 14:34:06.893901 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37e4aed4-039e-4b2b-89d7-65c43eb8f688-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.253065 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb" event={"ID":"37e4aed4-039e-4b2b-89d7-65c43eb8f688","Type":"ContainerDied","Data":"69586b536fe0bda40e744761e30fced3a1f0ad2bbff68f0b706c44b4fa4f39e3"} Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.253119 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69586b536fe0bda40e744761e30fced3a1f0ad2bbff68f0b706c44b4fa4f39e3" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.253197 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.332496 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx"] Mar 10 14:34:07 crc kubenswrapper[4911]: E0310 14:34:07.333393 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91a7cec-4b2e-4ec7-bb08-f5d6043394cc" containerName="oc" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.333479 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91a7cec-4b2e-4ec7-bb08-f5d6043394cc" containerName="oc" Mar 10 14:34:07 crc kubenswrapper[4911]: E0310 14:34:07.333546 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e4aed4-039e-4b2b-89d7-65c43eb8f688" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.333601 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e4aed4-039e-4b2b-89d7-65c43eb8f688" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.333933 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e4aed4-039e-4b2b-89d7-65c43eb8f688" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.334030 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91a7cec-4b2e-4ec7-bb08-f5d6043394cc" containerName="oc" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.335067 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.337382 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.337502 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.338003 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.339065 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.342567 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx"] Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.406035 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstmx\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.406086 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krd77\" (UniqueName: \"kubernetes.io/projected/7ea3ab89-1a92-47f9-85a5-3df48990343b-kube-api-access-krd77\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstmx\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:07 crc kubenswrapper[4911]: 
I0310 14:34:07.406271 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstmx\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.508433 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstmx\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.508566 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstmx\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.508604 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krd77\" (UniqueName: \"kubernetes.io/projected/7ea3ab89-1a92-47f9-85a5-3df48990343b-kube-api-access-krd77\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstmx\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.512836 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstmx\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.517489 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstmx\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.527271 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krd77\" (UniqueName: \"kubernetes.io/projected/7ea3ab89-1a92-47f9-85a5-3df48990343b-kube-api-access-krd77\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstmx\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:07 crc kubenswrapper[4911]: I0310 14:34:07.660468 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:34:08 crc kubenswrapper[4911]: I0310 14:34:08.207811 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx"] Mar 10 14:34:08 crc kubenswrapper[4911]: I0310 14:34:08.266127 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" event={"ID":"7ea3ab89-1a92-47f9-85a5-3df48990343b","Type":"ContainerStarted","Data":"66d4885cf24ad24fd8e10c9d7519a8ed46cdd4782afe66056db8739f83a4a01c"} Mar 10 14:34:09 crc kubenswrapper[4911]: I0310 14:34:09.276410 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" event={"ID":"7ea3ab89-1a92-47f9-85a5-3df48990343b","Type":"ContainerStarted","Data":"70928cbf453a7962cd453ec8ace0a60115eea0404bb49f8ca5fc72493be86fb4"} Mar 10 14:34:09 crc kubenswrapper[4911]: I0310 14:34:09.299455 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" podStartSLOduration=1.894797665 podStartE2EDuration="2.299414828s" podCreationTimestamp="2026-03-10 14:34:07 +0000 UTC" firstStartedPulling="2026-03-10 14:34:08.217011147 +0000 UTC m=+1952.780531064" lastFinishedPulling="2026-03-10 14:34:08.62162831 +0000 UTC m=+1953.185148227" observedRunningTime="2026-03-10 14:34:09.292916956 +0000 UTC m=+1953.856436873" watchObservedRunningTime="2026-03-10 14:34:09.299414828 +0000 UTC m=+1953.862934745" Mar 10 14:34:14 crc kubenswrapper[4911]: I0310 14:34:14.049496 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-k62d9"] Mar 10 14:34:14 crc kubenswrapper[4911]: I0310 14:34:14.063942 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-k62d9"] Mar 10 14:34:14 crc kubenswrapper[4911]: 
I0310 14:34:14.235417 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9942f116-fd81-4e92-bd0f-add9b12b4c08" path="/var/lib/kubelet/pods/9942f116-fd81-4e92-bd0f-add9b12b4c08/volumes" Mar 10 14:34:54 crc kubenswrapper[4911]: I0310 14:34:54.877117 4911 scope.go:117] "RemoveContainer" containerID="a7685173559e3dc24a03bb6e10a53a23002da938421bb4502acd6723ddc4e6d2" Mar 10 14:34:54 crc kubenswrapper[4911]: I0310 14:34:54.922660 4911 scope.go:117] "RemoveContainer" containerID="6ac24e83b1bcafb3912912c9a8d36f42c0cd46cc3cea4ee86def16e037d476d7" Mar 10 14:34:54 crc kubenswrapper[4911]: I0310 14:34:54.968387 4911 scope.go:117] "RemoveContainer" containerID="88605f944fe9db9675ed52044c08972beadbb1e0b2b3b2f7ee7dec705d09982f" Mar 10 14:34:55 crc kubenswrapper[4911]: I0310 14:34:55.023585 4911 scope.go:117] "RemoveContainer" containerID="9061b4e578dbba907356514f9fc9ffcb1738d5dff660b5b603fe293f8737ed79" Mar 10 14:34:55 crc kubenswrapper[4911]: I0310 14:34:55.048384 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-lwpcs"] Mar 10 14:34:55 crc kubenswrapper[4911]: I0310 14:34:55.060933 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-lwpcs"] Mar 10 14:34:55 crc kubenswrapper[4911]: I0310 14:34:55.087956 4911 scope.go:117] "RemoveContainer" containerID="b565c79511677b5104e61adf04d1b88b0ff1a7e493a7f6e74fe5f320924efb13" Mar 10 14:34:56 crc kubenswrapper[4911]: I0310 14:34:56.043771 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-wdsg2"] Mar 10 14:34:56 crc kubenswrapper[4911]: I0310 14:34:56.062223 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c623-account-create-update-pzwxt"] Mar 10 14:34:56 crc kubenswrapper[4911]: I0310 14:34:56.072006 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-m9ftl"] Mar 10 14:34:56 crc kubenswrapper[4911]: I0310 14:34:56.081539 4911 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c623-account-create-update-pzwxt"] Mar 10 14:34:56 crc kubenswrapper[4911]: I0310 14:34:56.090236 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-m9ftl"] Mar 10 14:34:56 crc kubenswrapper[4911]: I0310 14:34:56.098878 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-wdsg2"] Mar 10 14:34:56 crc kubenswrapper[4911]: I0310 14:34:56.206234 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e0ca1a-a12e-45bd-8adc-3852606ec8a9" path="/var/lib/kubelet/pods/34e0ca1a-a12e-45bd-8adc-3852606ec8a9/volumes" Mar 10 14:34:56 crc kubenswrapper[4911]: I0310 14:34:56.207055 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e63a1e-e526-498d-b14f-0720657e6c30" path="/var/lib/kubelet/pods/41e63a1e-e526-498d-b14f-0720657e6c30/volumes" Mar 10 14:34:56 crc kubenswrapper[4911]: I0310 14:34:56.207726 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b02600-8a57-4944-9f93-cdf74f7dad84" path="/var/lib/kubelet/pods/e0b02600-8a57-4944-9f93-cdf74f7dad84/volumes" Mar 10 14:34:56 crc kubenswrapper[4911]: I0310 14:34:56.208608 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b4b396-5ac9-490b-a59e-b568ca3cb638" path="/var/lib/kubelet/pods/f9b4b396-5ac9-490b-a59e-b568ca3cb638/volumes" Mar 10 14:34:57 crc kubenswrapper[4911]: I0310 14:34:57.032075 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5c94-account-create-update-9z2xg"] Mar 10 14:34:57 crc kubenswrapper[4911]: I0310 14:34:57.042364 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d929-account-create-update-z2j9n"] Mar 10 14:34:57 crc kubenswrapper[4911]: I0310 14:34:57.050637 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5c94-account-create-update-9z2xg"] Mar 10 14:34:57 crc 
kubenswrapper[4911]: I0310 14:34:57.059743 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d929-account-create-update-z2j9n"] Mar 10 14:34:58 crc kubenswrapper[4911]: I0310 14:34:58.210653 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20d998cd-6fc6-498b-9a65-18dbf5931cc3" path="/var/lib/kubelet/pods/20d998cd-6fc6-498b-9a65-18dbf5931cc3/volumes" Mar 10 14:34:58 crc kubenswrapper[4911]: I0310 14:34:58.211468 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e53f4fb-9bff-46a8-a447-e19ee4777c3b" path="/var/lib/kubelet/pods/2e53f4fb-9bff-46a8-a447-e19ee4777c3b/volumes" Mar 10 14:35:13 crc kubenswrapper[4911]: I0310 14:35:13.954899 4911 generic.go:334] "Generic (PLEG): container finished" podID="7ea3ab89-1a92-47f9-85a5-3df48990343b" containerID="70928cbf453a7962cd453ec8ace0a60115eea0404bb49f8ca5fc72493be86fb4" exitCode=0 Mar 10 14:35:13 crc kubenswrapper[4911]: I0310 14:35:13.954971 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" event={"ID":"7ea3ab89-1a92-47f9-85a5-3df48990343b","Type":"ContainerDied","Data":"70928cbf453a7962cd453ec8ace0a60115eea0404bb49f8ca5fc72493be86fb4"} Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.400228 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.440288 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-ssh-key-openstack-edpm-ipam\") pod \"7ea3ab89-1a92-47f9-85a5-3df48990343b\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.440394 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-inventory\") pod \"7ea3ab89-1a92-47f9-85a5-3df48990343b\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.441274 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krd77\" (UniqueName: \"kubernetes.io/projected/7ea3ab89-1a92-47f9-85a5-3df48990343b-kube-api-access-krd77\") pod \"7ea3ab89-1a92-47f9-85a5-3df48990343b\" (UID: \"7ea3ab89-1a92-47f9-85a5-3df48990343b\") " Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.445991 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea3ab89-1a92-47f9-85a5-3df48990343b-kube-api-access-krd77" (OuterVolumeSpecName: "kube-api-access-krd77") pod "7ea3ab89-1a92-47f9-85a5-3df48990343b" (UID: "7ea3ab89-1a92-47f9-85a5-3df48990343b"). InnerVolumeSpecName "kube-api-access-krd77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.468128 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7ea3ab89-1a92-47f9-85a5-3df48990343b" (UID: "7ea3ab89-1a92-47f9-85a5-3df48990343b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.472906 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-inventory" (OuterVolumeSpecName: "inventory") pod "7ea3ab89-1a92-47f9-85a5-3df48990343b" (UID: "7ea3ab89-1a92-47f9-85a5-3df48990343b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.544186 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.544238 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ea3ab89-1a92-47f9-85a5-3df48990343b-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.544251 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krd77\" (UniqueName: \"kubernetes.io/projected/7ea3ab89-1a92-47f9-85a5-3df48990343b-kube-api-access-krd77\") on node \"crc\" DevicePath \"\"" Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.975144 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" 
event={"ID":"7ea3ab89-1a92-47f9-85a5-3df48990343b","Type":"ContainerDied","Data":"66d4885cf24ad24fd8e10c9d7519a8ed46cdd4782afe66056db8739f83a4a01c"} Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.975502 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d4885cf24ad24fd8e10c9d7519a8ed46cdd4782afe66056db8739f83a4a01c" Mar 10 14:35:15 crc kubenswrapper[4911]: I0310 14:35:15.975221 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstmx" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.088430 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn"] Mar 10 14:35:16 crc kubenswrapper[4911]: E0310 14:35:16.088917 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea3ab89-1a92-47f9-85a5-3df48990343b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.088936 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea3ab89-1a92-47f9-85a5-3df48990343b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.089142 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea3ab89-1a92-47f9-85a5-3df48990343b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.090968 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.093813 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.094045 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.094168 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.094914 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.100273 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn"] Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.158521 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5r7k\" (UniqueName: \"kubernetes.io/projected/94cde38e-e826-4cad-9f7a-55e42ec4964a-kube-api-access-v5r7k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nttsn\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.158692 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nttsn\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 
14:35:16.158806 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nttsn\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.261178 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nttsn\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.261343 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5r7k\" (UniqueName: \"kubernetes.io/projected/94cde38e-e826-4cad-9f7a-55e42ec4964a-kube-api-access-v5r7k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nttsn\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.261464 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nttsn\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.270238 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nttsn\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.275059 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nttsn\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.278981 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5r7k\" (UniqueName: \"kubernetes.io/projected/94cde38e-e826-4cad-9f7a-55e42ec4964a-kube-api-access-v5r7k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nttsn\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.411132 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.947272 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn"] Mar 10 14:35:16 crc kubenswrapper[4911]: I0310 14:35:16.985684 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" event={"ID":"94cde38e-e826-4cad-9f7a-55e42ec4964a","Type":"ContainerStarted","Data":"c6874f7367213956f340f87d044bd755d0a47107070cd89fd78b2d3c3e85d0c5"} Mar 10 14:35:17 crc kubenswrapper[4911]: I0310 14:35:17.995091 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" event={"ID":"94cde38e-e826-4cad-9f7a-55e42ec4964a","Type":"ContainerStarted","Data":"409746fe4300f574af5a1e31e71b246c6d38c467b9cb34320657d2d3bea0148f"} Mar 10 14:35:18 crc kubenswrapper[4911]: I0310 14:35:18.020508 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" podStartSLOduration=1.832258796 podStartE2EDuration="2.020478168s" podCreationTimestamp="2026-03-10 14:35:16 +0000 UTC" firstStartedPulling="2026-03-10 14:35:16.952252084 +0000 UTC m=+2021.515772001" lastFinishedPulling="2026-03-10 14:35:17.140471446 +0000 UTC m=+2021.703991373" observedRunningTime="2026-03-10 14:35:18.01451428 +0000 UTC m=+2022.578034217" watchObservedRunningTime="2026-03-10 14:35:18.020478168 +0000 UTC m=+2022.583998085" Mar 10 14:35:22 crc kubenswrapper[4911]: I0310 14:35:22.046295 4911 generic.go:334] "Generic (PLEG): container finished" podID="94cde38e-e826-4cad-9f7a-55e42ec4964a" containerID="409746fe4300f574af5a1e31e71b246c6d38c467b9cb34320657d2d3bea0148f" exitCode=0 Mar 10 14:35:22 crc kubenswrapper[4911]: I0310 14:35:22.046375 4911 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" event={"ID":"94cde38e-e826-4cad-9f7a-55e42ec4964a","Type":"ContainerDied","Data":"409746fe4300f574af5a1e31e71b246c6d38c467b9cb34320657d2d3bea0148f"} Mar 10 14:35:23 crc kubenswrapper[4911]: I0310 14:35:23.468483 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:23 crc kubenswrapper[4911]: I0310 14:35:23.632465 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5r7k\" (UniqueName: \"kubernetes.io/projected/94cde38e-e826-4cad-9f7a-55e42ec4964a-kube-api-access-v5r7k\") pod \"94cde38e-e826-4cad-9f7a-55e42ec4964a\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " Mar 10 14:35:23 crc kubenswrapper[4911]: I0310 14:35:23.632666 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-ssh-key-openstack-edpm-ipam\") pod \"94cde38e-e826-4cad-9f7a-55e42ec4964a\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " Mar 10 14:35:23 crc kubenswrapper[4911]: I0310 14:35:23.632939 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-inventory\") pod \"94cde38e-e826-4cad-9f7a-55e42ec4964a\" (UID: \"94cde38e-e826-4cad-9f7a-55e42ec4964a\") " Mar 10 14:35:23 crc kubenswrapper[4911]: I0310 14:35:23.639188 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cde38e-e826-4cad-9f7a-55e42ec4964a-kube-api-access-v5r7k" (OuterVolumeSpecName: "kube-api-access-v5r7k") pod "94cde38e-e826-4cad-9f7a-55e42ec4964a" (UID: "94cde38e-e826-4cad-9f7a-55e42ec4964a"). InnerVolumeSpecName "kube-api-access-v5r7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:35:23 crc kubenswrapper[4911]: I0310 14:35:23.663053 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-inventory" (OuterVolumeSpecName: "inventory") pod "94cde38e-e826-4cad-9f7a-55e42ec4964a" (UID: "94cde38e-e826-4cad-9f7a-55e42ec4964a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:35:23 crc kubenswrapper[4911]: I0310 14:35:23.663549 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "94cde38e-e826-4cad-9f7a-55e42ec4964a" (UID: "94cde38e-e826-4cad-9f7a-55e42ec4964a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:35:23 crc kubenswrapper[4911]: I0310 14:35:23.740223 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 14:35:23 crc kubenswrapper[4911]: I0310 14:35:23.740279 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5r7k\" (UniqueName: \"kubernetes.io/projected/94cde38e-e826-4cad-9f7a-55e42ec4964a-kube-api-access-v5r7k\") on node \"crc\" DevicePath \"\"" Mar 10 14:35:23 crc kubenswrapper[4911]: I0310 14:35:23.740294 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94cde38e-e826-4cad-9f7a-55e42ec4964a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.067344 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" 
event={"ID":"94cde38e-e826-4cad-9f7a-55e42ec4964a","Type":"ContainerDied","Data":"c6874f7367213956f340f87d044bd755d0a47107070cd89fd78b2d3c3e85d0c5"} Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.067798 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6874f7367213956f340f87d044bd755d0a47107070cd89fd78b2d3c3e85d0c5" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.067466 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nttsn" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.146596 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h"] Mar 10 14:35:24 crc kubenswrapper[4911]: E0310 14:35:24.147133 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cde38e-e826-4cad-9f7a-55e42ec4964a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.147158 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cde38e-e826-4cad-9f7a-55e42ec4964a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.147355 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="94cde38e-e826-4cad-9f7a-55e42ec4964a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.148178 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.150445 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.150756 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.151104 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.151148 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.165427 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h"] Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.250495 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9w8h\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.353580 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9w8h\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.354088 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mngs9\" (UniqueName: \"kubernetes.io/projected/6c29812e-9268-4508-aef7-cb43fe278c8d-kube-api-access-mngs9\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9w8h\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.354302 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9w8h\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.359892 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9w8h\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.455741 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9w8h\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.455848 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mngs9\" (UniqueName: 
\"kubernetes.io/projected/6c29812e-9268-4508-aef7-cb43fe278c8d-kube-api-access-mngs9\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9w8h\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.460604 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9w8h\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.479353 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mngs9\" (UniqueName: \"kubernetes.io/projected/6c29812e-9268-4508-aef7-cb43fe278c8d-kube-api-access-mngs9\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9w8h\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:24 crc kubenswrapper[4911]: I0310 14:35:24.764962 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:35:25 crc kubenswrapper[4911]: I0310 14:35:25.331891 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h"] Mar 10 14:35:26 crc kubenswrapper[4911]: I0310 14:35:26.067788 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ls7n2"] Mar 10 14:35:26 crc kubenswrapper[4911]: I0310 14:35:26.081855 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ls7n2"] Mar 10 14:35:26 crc kubenswrapper[4911]: I0310 14:35:26.089898 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" event={"ID":"6c29812e-9268-4508-aef7-cb43fe278c8d","Type":"ContainerStarted","Data":"d93eaf7e36dcbafc44267d9001324aab599b2cd7c82d8ccf063a70d8ecd086eb"} Mar 10 14:35:26 crc kubenswrapper[4911]: I0310 14:35:26.089949 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" event={"ID":"6c29812e-9268-4508-aef7-cb43fe278c8d","Type":"ContainerStarted","Data":"a9c039a9654f08e4dd65a1ae1a95e3d8ec5b82711443938cc4e022615c2bc9db"} Mar 10 14:35:26 crc kubenswrapper[4911]: I0310 14:35:26.123170 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" podStartSLOduration=1.953393991 podStartE2EDuration="2.123144663s" podCreationTimestamp="2026-03-10 14:35:24 +0000 UTC" firstStartedPulling="2026-03-10 14:35:25.33885411 +0000 UTC m=+2029.902374037" lastFinishedPulling="2026-03-10 14:35:25.508604792 +0000 UTC m=+2030.072124709" observedRunningTime="2026-03-10 14:35:26.106274476 +0000 UTC m=+2030.669794393" watchObservedRunningTime="2026-03-10 14:35:26.123144663 +0000 UTC m=+2030.686664580" Mar 10 14:35:26 crc kubenswrapper[4911]: I0310 
14:35:26.208358 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="022d43c3-2aa6-4720-9bc2-c79662f9ec3c" path="/var/lib/kubelet/pods/022d43c3-2aa6-4720-9bc2-c79662f9ec3c/volumes" Mar 10 14:35:48 crc kubenswrapper[4911]: I0310 14:35:48.521089 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:35:48 crc kubenswrapper[4911]: I0310 14:35:48.521683 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:35:55 crc kubenswrapper[4911]: I0310 14:35:55.214432 4911 scope.go:117] "RemoveContainer" containerID="4a8a84684bed8d4e40666e945d53ca5a61b3dfc8972b3eb66b59e700a01915f0" Mar 10 14:35:55 crc kubenswrapper[4911]: I0310 14:35:55.239380 4911 scope.go:117] "RemoveContainer" containerID="07eb6bb91dfc269f60ea6291d5447ac5382f38d6404f50e03dd2d82adf7063f9" Mar 10 14:35:55 crc kubenswrapper[4911]: I0310 14:35:55.309375 4911 scope.go:117] "RemoveContainer" containerID="c1064abdb392bc6ef5a31b8771d7552400a6cb1290ede03b00c41caa11337366" Mar 10 14:35:55 crc kubenswrapper[4911]: I0310 14:35:55.330355 4911 scope.go:117] "RemoveContainer" containerID="73f076ceb280a0cd17667cc0386ae710c3b52ce5e9251dc76fbc7f827913d7cd" Mar 10 14:35:55 crc kubenswrapper[4911]: I0310 14:35:55.374048 4911 scope.go:117] "RemoveContainer" containerID="051e07b5e5787eba2f639031817bf7c502ed899b72406e551a99d22edbd6ecc6" Mar 10 14:35:55 crc kubenswrapper[4911]: I0310 14:35:55.444642 4911 scope.go:117] "RemoveContainer" 
containerID="bcbc688113f8cb2ee00b0d735b3165ebf42dac0e7b4cddd1e0c84076a0695e3b" Mar 10 14:35:55 crc kubenswrapper[4911]: I0310 14:35:55.467784 4911 scope.go:117] "RemoveContainer" containerID="0f625dededc2b14e4a81dee42d61239a44f8452622e2682478cf067aa3a2b125" Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.153063 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552556-sltw5"] Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.155147 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552556-sltw5" Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.157867 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.158345 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.162570 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.163924 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552556-sltw5"] Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.237641 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd9vj\" (UniqueName: \"kubernetes.io/projected/0c9dba33-a3f9-4a30-9c5a-8e9871a09592-kube-api-access-jd9vj\") pod \"auto-csr-approver-29552556-sltw5\" (UID: \"0c9dba33-a3f9-4a30-9c5a-8e9871a09592\") " pod="openshift-infra/auto-csr-approver-29552556-sltw5" Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.339563 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd9vj\" (UniqueName: 
\"kubernetes.io/projected/0c9dba33-a3f9-4a30-9c5a-8e9871a09592-kube-api-access-jd9vj\") pod \"auto-csr-approver-29552556-sltw5\" (UID: \"0c9dba33-a3f9-4a30-9c5a-8e9871a09592\") " pod="openshift-infra/auto-csr-approver-29552556-sltw5" Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.361746 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd9vj\" (UniqueName: \"kubernetes.io/projected/0c9dba33-a3f9-4a30-9c5a-8e9871a09592-kube-api-access-jd9vj\") pod \"auto-csr-approver-29552556-sltw5\" (UID: \"0c9dba33-a3f9-4a30-9c5a-8e9871a09592\") " pod="openshift-infra/auto-csr-approver-29552556-sltw5" Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.455734 4911 generic.go:334] "Generic (PLEG): container finished" podID="6c29812e-9268-4508-aef7-cb43fe278c8d" containerID="d93eaf7e36dcbafc44267d9001324aab599b2cd7c82d8ccf063a70d8ecd086eb" exitCode=0 Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.455785 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" event={"ID":"6c29812e-9268-4508-aef7-cb43fe278c8d","Type":"ContainerDied","Data":"d93eaf7e36dcbafc44267d9001324aab599b2cd7c82d8ccf063a70d8ecd086eb"} Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.482568 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552556-sltw5" Mar 10 14:36:00 crc kubenswrapper[4911]: I0310 14:36:00.953481 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552556-sltw5"] Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.289667 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-92bdn"] Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.292039 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.304002 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92bdn"] Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.359253 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-catalog-content\") pod \"redhat-operators-92bdn\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.359372 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-utilities\") pod \"redhat-operators-92bdn\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.359414 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jccc\" (UniqueName: \"kubernetes.io/projected/1a61a032-3087-4a90-ba9e-bb6689d8ef08-kube-api-access-6jccc\") pod \"redhat-operators-92bdn\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.460978 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-utilities\") pod \"redhat-operators-92bdn\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.461048 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6jccc\" (UniqueName: \"kubernetes.io/projected/1a61a032-3087-4a90-ba9e-bb6689d8ef08-kube-api-access-6jccc\") pod \"redhat-operators-92bdn\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.461159 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-catalog-content\") pod \"redhat-operators-92bdn\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.461872 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-catalog-content\") pod \"redhat-operators-92bdn\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.462419 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-utilities\") pod \"redhat-operators-92bdn\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.469990 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552556-sltw5" event={"ID":"0c9dba33-a3f9-4a30-9c5a-8e9871a09592","Type":"ContainerStarted","Data":"b6322a480f818c3de2b3784b8b298b74a1855864d7a820d40da97bfe27ba7f63"} Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.484757 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jccc\" (UniqueName: \"kubernetes.io/projected/1a61a032-3087-4a90-ba9e-bb6689d8ef08-kube-api-access-6jccc\") pod 
\"redhat-operators-92bdn\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:01 crc kubenswrapper[4911]: I0310 14:36:01.620053 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.006989 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.140020 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92bdn"] Mar 10 14:36:02 crc kubenswrapper[4911]: W0310 14:36:02.144480 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a61a032_3087_4a90_ba9e_bb6689d8ef08.slice/crio-4a01d9841219ee40ed010a564abd8c6fa7022fac086a3aed0a663b3ff5f2a8a0 WatchSource:0}: Error finding container 4a01d9841219ee40ed010a564abd8c6fa7022fac086a3aed0a663b3ff5f2a8a0: Status 404 returned error can't find the container with id 4a01d9841219ee40ed010a564abd8c6fa7022fac086a3aed0a663b3ff5f2a8a0 Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.206173 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-ssh-key-openstack-edpm-ipam\") pod \"6c29812e-9268-4508-aef7-cb43fe278c8d\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.206318 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mngs9\" (UniqueName: \"kubernetes.io/projected/6c29812e-9268-4508-aef7-cb43fe278c8d-kube-api-access-mngs9\") pod \"6c29812e-9268-4508-aef7-cb43fe278c8d\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " Mar 10 
14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.206415 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-inventory\") pod \"6c29812e-9268-4508-aef7-cb43fe278c8d\" (UID: \"6c29812e-9268-4508-aef7-cb43fe278c8d\") " Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.243624 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c29812e-9268-4508-aef7-cb43fe278c8d-kube-api-access-mngs9" (OuterVolumeSpecName: "kube-api-access-mngs9") pod "6c29812e-9268-4508-aef7-cb43fe278c8d" (UID: "6c29812e-9268-4508-aef7-cb43fe278c8d"). InnerVolumeSpecName "kube-api-access-mngs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.258048 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6c29812e-9268-4508-aef7-cb43fe278c8d" (UID: "6c29812e-9268-4508-aef7-cb43fe278c8d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.291484 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-inventory" (OuterVolumeSpecName: "inventory") pod "6c29812e-9268-4508-aef7-cb43fe278c8d" (UID: "6c29812e-9268-4508-aef7-cb43fe278c8d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.318605 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.318659 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mngs9\" (UniqueName: \"kubernetes.io/projected/6c29812e-9268-4508-aef7-cb43fe278c8d-kube-api-access-mngs9\") on node \"crc\" DevicePath \"\"" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.318750 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c29812e-9268-4508-aef7-cb43fe278c8d-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.481387 4911 generic.go:334] "Generic (PLEG): container finished" podID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerID="3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c" exitCode=0 Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.481496 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92bdn" event={"ID":"1a61a032-3087-4a90-ba9e-bb6689d8ef08","Type":"ContainerDied","Data":"3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c"} Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.481569 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92bdn" event={"ID":"1a61a032-3087-4a90-ba9e-bb6689d8ef08","Type":"ContainerStarted","Data":"4a01d9841219ee40ed010a564abd8c6fa7022fac086a3aed0a663b3ff5f2a8a0"} Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.484393 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552556-sltw5" 
event={"ID":"0c9dba33-a3f9-4a30-9c5a-8e9871a09592","Type":"ContainerStarted","Data":"894d55192e067d421967c5bec84669bcbfa6e647b27d3ec2a50d8f86b98f1dfe"} Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.486504 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" event={"ID":"6c29812e-9268-4508-aef7-cb43fe278c8d","Type":"ContainerDied","Data":"a9c039a9654f08e4dd65a1ae1a95e3d8ec5b82711443938cc4e022615c2bc9db"} Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.486537 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9c039a9654f08e4dd65a1ae1a95e3d8ec5b82711443938cc4e022615c2bc9db" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.486606 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9w8h" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.536071 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552556-sltw5" podStartSLOduration=1.526419323 podStartE2EDuration="2.536039644s" podCreationTimestamp="2026-03-10 14:36:00 +0000 UTC" firstStartedPulling="2026-03-10 14:36:00.953986749 +0000 UTC m=+2065.517506666" lastFinishedPulling="2026-03-10 14:36:01.96360707 +0000 UTC m=+2066.527126987" observedRunningTime="2026-03-10 14:36:02.533157257 +0000 UTC m=+2067.096677184" watchObservedRunningTime="2026-03-10 14:36:02.536039644 +0000 UTC m=+2067.099559561" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.585637 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct"] Mar 10 14:36:02 crc kubenswrapper[4911]: E0310 14:36:02.586215 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c29812e-9268-4508-aef7-cb43fe278c8d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 14:36:02 crc 
kubenswrapper[4911]: I0310 14:36:02.586238 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c29812e-9268-4508-aef7-cb43fe278c8d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.586465 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c29812e-9268-4508-aef7-cb43fe278c8d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.587297 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.594558 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.594892 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.595098 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.595247 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.618652 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct"] Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.623202 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.623278 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtdfw\" (UniqueName: \"kubernetes.io/projected/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-kube-api-access-jtdfw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.623324 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.725292 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.725407 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtdfw\" (UniqueName: \"kubernetes.io/projected/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-kube-api-access-jtdfw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.725873 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.732199 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.732292 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.754275 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtdfw\" (UniqueName: \"kubernetes.io/projected/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-kube-api-access-jtdfw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:02 crc kubenswrapper[4911]: I0310 14:36:02.931324 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:03 crc kubenswrapper[4911]: I0310 14:36:03.491764 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct"] Mar 10 14:36:03 crc kubenswrapper[4911]: I0310 14:36:03.500258 4911 generic.go:334] "Generic (PLEG): container finished" podID="0c9dba33-a3f9-4a30-9c5a-8e9871a09592" containerID="894d55192e067d421967c5bec84669bcbfa6e647b27d3ec2a50d8f86b98f1dfe" exitCode=0 Mar 10 14:36:03 crc kubenswrapper[4911]: I0310 14:36:03.500322 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552556-sltw5" event={"ID":"0c9dba33-a3f9-4a30-9c5a-8e9871a09592","Type":"ContainerDied","Data":"894d55192e067d421967c5bec84669bcbfa6e647b27d3ec2a50d8f86b98f1dfe"} Mar 10 14:36:04 crc kubenswrapper[4911]: I0310 14:36:04.513364 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92bdn" event={"ID":"1a61a032-3087-4a90-ba9e-bb6689d8ef08","Type":"ContainerStarted","Data":"a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8"} Mar 10 14:36:04 crc kubenswrapper[4911]: I0310 14:36:04.515915 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" event={"ID":"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2","Type":"ContainerStarted","Data":"45d075b84e30c3a46fbe97c542fb58536416ab0e14726d56cff58867bb81d105"} Mar 10 14:36:04 crc kubenswrapper[4911]: I0310 14:36:04.516005 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" event={"ID":"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2","Type":"ContainerStarted","Data":"1cc5bb930510ab0d8128edeb9bbf08f4a6671942f81606f099a3dfd6d335f38a"} Mar 10 14:36:04 crc kubenswrapper[4911]: I0310 14:36:04.560041 4911 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" podStartSLOduration=2.387367871 podStartE2EDuration="2.56001534s" podCreationTimestamp="2026-03-10 14:36:02 +0000 UTC" firstStartedPulling="2026-03-10 14:36:03.50061121 +0000 UTC m=+2068.064131127" lastFinishedPulling="2026-03-10 14:36:03.673258679 +0000 UTC m=+2068.236778596" observedRunningTime="2026-03-10 14:36:04.552896711 +0000 UTC m=+2069.116416638" watchObservedRunningTime="2026-03-10 14:36:04.56001534 +0000 UTC m=+2069.123535257" Mar 10 14:36:04 crc kubenswrapper[4911]: I0310 14:36:04.881137 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552556-sltw5" Mar 10 14:36:04 crc kubenswrapper[4911]: I0310 14:36:04.886686 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd9vj\" (UniqueName: \"kubernetes.io/projected/0c9dba33-a3f9-4a30-9c5a-8e9871a09592-kube-api-access-jd9vj\") pod \"0c9dba33-a3f9-4a30-9c5a-8e9871a09592\" (UID: \"0c9dba33-a3f9-4a30-9c5a-8e9871a09592\") " Mar 10 14:36:04 crc kubenswrapper[4911]: I0310 14:36:04.892588 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9dba33-a3f9-4a30-9c5a-8e9871a09592-kube-api-access-jd9vj" (OuterVolumeSpecName: "kube-api-access-jd9vj") pod "0c9dba33-a3f9-4a30-9c5a-8e9871a09592" (UID: "0c9dba33-a3f9-4a30-9c5a-8e9871a09592"). InnerVolumeSpecName "kube-api-access-jd9vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:36:04 crc kubenswrapper[4911]: I0310 14:36:04.989554 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd9vj\" (UniqueName: \"kubernetes.io/projected/0c9dba33-a3f9-4a30-9c5a-8e9871a09592-kube-api-access-jd9vj\") on node \"crc\" DevicePath \"\"" Mar 10 14:36:05 crc kubenswrapper[4911]: I0310 14:36:05.526301 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552556-sltw5" event={"ID":"0c9dba33-a3f9-4a30-9c5a-8e9871a09592","Type":"ContainerDied","Data":"b6322a480f818c3de2b3784b8b298b74a1855864d7a820d40da97bfe27ba7f63"} Mar 10 14:36:05 crc kubenswrapper[4911]: I0310 14:36:05.527004 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6322a480f818c3de2b3784b8b298b74a1855864d7a820d40da97bfe27ba7f63" Mar 10 14:36:05 crc kubenswrapper[4911]: I0310 14:36:05.526332 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552556-sltw5" Mar 10 14:36:05 crc kubenswrapper[4911]: I0310 14:36:05.965044 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552550-lt684"] Mar 10 14:36:05 crc kubenswrapper[4911]: I0310 14:36:05.973825 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552550-lt684"] Mar 10 14:36:06 crc kubenswrapper[4911]: I0310 14:36:06.211009 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20567513-e7de-4124-aa52-e707e4a5eb14" path="/var/lib/kubelet/pods/20567513-e7de-4124-aa52-e707e4a5eb14/volumes" Mar 10 14:36:06 crc kubenswrapper[4911]: I0310 14:36:06.541844 4911 generic.go:334] "Generic (PLEG): container finished" podID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerID="a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8" exitCode=0 Mar 10 14:36:06 crc kubenswrapper[4911]: I0310 14:36:06.541962 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92bdn" event={"ID":"1a61a032-3087-4a90-ba9e-bb6689d8ef08","Type":"ContainerDied","Data":"a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8"} Mar 10 14:36:07 crc kubenswrapper[4911]: I0310 14:36:07.555396 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92bdn" event={"ID":"1a61a032-3087-4a90-ba9e-bb6689d8ef08","Type":"ContainerStarted","Data":"abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3"} Mar 10 14:36:07 crc kubenswrapper[4911]: I0310 14:36:07.585152 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-92bdn" podStartSLOduration=2.04486187 podStartE2EDuration="6.5851037s" podCreationTimestamp="2026-03-10 14:36:01 +0000 UTC" firstStartedPulling="2026-03-10 14:36:02.483984523 +0000 UTC m=+2067.047504460" lastFinishedPulling="2026-03-10 14:36:07.024226373 +0000 UTC m=+2071.587746290" observedRunningTime="2026-03-10 14:36:07.580653209 +0000 UTC m=+2072.144173126" watchObservedRunningTime="2026-03-10 14:36:07.5851037 +0000 UTC m=+2072.148623617" Mar 10 14:36:11 crc kubenswrapper[4911]: I0310 14:36:11.621148 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:11 crc kubenswrapper[4911]: I0310 14:36:11.621214 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:12 crc kubenswrapper[4911]: I0310 14:36:12.673524 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-92bdn" podUID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerName="registry-server" probeResult="failure" output=< Mar 10 14:36:12 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s Mar 10 14:36:12 crc kubenswrapper[4911]: > Mar 10 
14:36:18 crc kubenswrapper[4911]: I0310 14:36:18.520311 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:36:18 crc kubenswrapper[4911]: I0310 14:36:18.520808 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:36:21 crc kubenswrapper[4911]: I0310 14:36:21.669380 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:21 crc kubenswrapper[4911]: I0310 14:36:21.721253 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:21 crc kubenswrapper[4911]: I0310 14:36:21.913756 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92bdn"] Mar 10 14:36:22 crc kubenswrapper[4911]: I0310 14:36:22.701743 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-92bdn" podUID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerName="registry-server" containerID="cri-o://abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3" gracePeriod=2 Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.171231 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.273070 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-catalog-content\") pod \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.273356 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-utilities\") pod \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.273425 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jccc\" (UniqueName: \"kubernetes.io/projected/1a61a032-3087-4a90-ba9e-bb6689d8ef08-kube-api-access-6jccc\") pod \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\" (UID: \"1a61a032-3087-4a90-ba9e-bb6689d8ef08\") " Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.275014 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-utilities" (OuterVolumeSpecName: "utilities") pod "1a61a032-3087-4a90-ba9e-bb6689d8ef08" (UID: "1a61a032-3087-4a90-ba9e-bb6689d8ef08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.279686 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a61a032-3087-4a90-ba9e-bb6689d8ef08-kube-api-access-6jccc" (OuterVolumeSpecName: "kube-api-access-6jccc") pod "1a61a032-3087-4a90-ba9e-bb6689d8ef08" (UID: "1a61a032-3087-4a90-ba9e-bb6689d8ef08"). InnerVolumeSpecName "kube-api-access-6jccc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.377285 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.377339 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jccc\" (UniqueName: \"kubernetes.io/projected/1a61a032-3087-4a90-ba9e-bb6689d8ef08-kube-api-access-6jccc\") on node \"crc\" DevicePath \"\"" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.387418 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a61a032-3087-4a90-ba9e-bb6689d8ef08" (UID: "1a61a032-3087-4a90-ba9e-bb6689d8ef08"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.479794 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a61a032-3087-4a90-ba9e-bb6689d8ef08-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.715695 4911 generic.go:334] "Generic (PLEG): container finished" podID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerID="abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3" exitCode=0 Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.715926 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92bdn" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.715939 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92bdn" event={"ID":"1a61a032-3087-4a90-ba9e-bb6689d8ef08","Type":"ContainerDied","Data":"abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3"} Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.716023 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92bdn" event={"ID":"1a61a032-3087-4a90-ba9e-bb6689d8ef08","Type":"ContainerDied","Data":"4a01d9841219ee40ed010a564abd8c6fa7022fac086a3aed0a663b3ff5f2a8a0"} Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.716054 4911 scope.go:117] "RemoveContainer" containerID="abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.748019 4911 scope.go:117] "RemoveContainer" containerID="a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.760032 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92bdn"] Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.769561 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-92bdn"] Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.776827 4911 scope.go:117] "RemoveContainer" containerID="3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.833189 4911 scope.go:117] "RemoveContainer" containerID="abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3" Mar 10 14:36:23 crc kubenswrapper[4911]: E0310 14:36:23.833675 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3\": container with ID starting with abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3 not found: ID does not exist" containerID="abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.833708 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3"} err="failed to get container status \"abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3\": rpc error: code = NotFound desc = could not find container \"abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3\": container with ID starting with abf2c9921ae45ea1e85fc5659106d03e7d3e9a6db21a824f2e2ddbcb3e4830b3 not found: ID does not exist" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.833754 4911 scope.go:117] "RemoveContainer" containerID="a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8" Mar 10 14:36:23 crc kubenswrapper[4911]: E0310 14:36:23.834153 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8\": container with ID starting with a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8 not found: ID does not exist" containerID="a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.834178 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8"} err="failed to get container status \"a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8\": rpc error: code = NotFound desc = could not find container \"a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8\": container with ID 
starting with a313f64b772d721bb2ed6a50fa979ba95815c97348578c25c3e7fbd9c47933a8 not found: ID does not exist" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.834192 4911 scope.go:117] "RemoveContainer" containerID="3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c" Mar 10 14:36:23 crc kubenswrapper[4911]: E0310 14:36:23.834627 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c\": container with ID starting with 3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c not found: ID does not exist" containerID="3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c" Mar 10 14:36:23 crc kubenswrapper[4911]: I0310 14:36:23.834656 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c"} err="failed to get container status \"3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c\": rpc error: code = NotFound desc = could not find container \"3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c\": container with ID starting with 3a6d8902bf16de03dded8b91ece67822eeefb64bac82af03ae42c9642563578c not found: ID does not exist" Mar 10 14:36:24 crc kubenswrapper[4911]: I0310 14:36:24.206197 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" path="/var/lib/kubelet/pods/1a61a032-3087-4a90-ba9e-bb6689d8ef08/volumes" Mar 10 14:36:26 crc kubenswrapper[4911]: I0310 14:36:26.059111 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-cpd4g"] Mar 10 14:36:26 crc kubenswrapper[4911]: I0310 14:36:26.069023 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p89dc"] Mar 10 14:36:26 crc kubenswrapper[4911]: I0310 14:36:26.079075 4911 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-cpd4g"] Mar 10 14:36:26 crc kubenswrapper[4911]: I0310 14:36:26.088072 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p89dc"] Mar 10 14:36:26 crc kubenswrapper[4911]: I0310 14:36:26.211866 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d266d46-03f6-4f16-bf8e-fc44da521e64" path="/var/lib/kubelet/pods/1d266d46-03f6-4f16-bf8e-fc44da521e64/volumes" Mar 10 14:36:26 crc kubenswrapper[4911]: I0310 14:36:26.213253 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6554330-1024-4d85-8b5b-f7f354c9631d" path="/var/lib/kubelet/pods/b6554330-1024-4d85-8b5b-f7f354c9631d/volumes" Mar 10 14:36:48 crc kubenswrapper[4911]: I0310 14:36:48.521386 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:36:48 crc kubenswrapper[4911]: I0310 14:36:48.522319 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:36:48 crc kubenswrapper[4911]: I0310 14:36:48.522403 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:36:48 crc kubenswrapper[4911]: I0310 14:36:48.523667 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4cb4b818a888d5417b0612f19c0270a3d874b51a5ee2051ea9d2db487bebc236"} 
pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 14:36:48 crc kubenswrapper[4911]: I0310 14:36:48.523794 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://4cb4b818a888d5417b0612f19c0270a3d874b51a5ee2051ea9d2db487bebc236" gracePeriod=600 Mar 10 14:36:48 crc kubenswrapper[4911]: I0310 14:36:48.965756 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="4cb4b818a888d5417b0612f19c0270a3d874b51a5ee2051ea9d2db487bebc236" exitCode=0 Mar 10 14:36:48 crc kubenswrapper[4911]: I0310 14:36:48.965766 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"4cb4b818a888d5417b0612f19c0270a3d874b51a5ee2051ea9d2db487bebc236"} Mar 10 14:36:48 crc kubenswrapper[4911]: I0310 14:36:48.966257 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233"} Mar 10 14:36:48 crc kubenswrapper[4911]: I0310 14:36:48.966294 4911 scope.go:117] "RemoveContainer" containerID="560c21ffbea17fa7f1d3a3031fe29ca6be7ba4167dfe883623b5757995403e7c" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.361259 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hzxnm"] Mar 10 14:36:49 crc kubenswrapper[4911]: E0310 14:36:49.362792 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerName="extract-utilities" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.362816 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerName="extract-utilities" Mar 10 14:36:49 crc kubenswrapper[4911]: E0310 14:36:49.362829 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerName="registry-server" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.362836 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerName="registry-server" Mar 10 14:36:49 crc kubenswrapper[4911]: E0310 14:36:49.362857 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerName="extract-content" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.362863 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerName="extract-content" Mar 10 14:36:49 crc kubenswrapper[4911]: E0310 14:36:49.362894 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9dba33-a3f9-4a30-9c5a-8e9871a09592" containerName="oc" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.362900 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9dba33-a3f9-4a30-9c5a-8e9871a09592" containerName="oc" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.363083 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a61a032-3087-4a90-ba9e-bb6689d8ef08" containerName="registry-server" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.363111 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9dba33-a3f9-4a30-9c5a-8e9871a09592" containerName="oc" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.364576 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.393139 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzxnm"] Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.526527 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-utilities\") pod \"redhat-marketplace-hzxnm\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") " pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.526589 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8n69\" (UniqueName: \"kubernetes.io/projected/0867bc20-6bce-4f3e-9c03-e61e215c4102-kube-api-access-h8n69\") pod \"redhat-marketplace-hzxnm\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") " pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.526666 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-catalog-content\") pod \"redhat-marketplace-hzxnm\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") " pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.628351 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-utilities\") pod \"redhat-marketplace-hzxnm\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") " pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.628862 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-h8n69\" (UniqueName: \"kubernetes.io/projected/0867bc20-6bce-4f3e-9c03-e61e215c4102-kube-api-access-h8n69\") pod \"redhat-marketplace-hzxnm\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") " pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.628815 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-utilities\") pod \"redhat-marketplace-hzxnm\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") " pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.628949 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-catalog-content\") pod \"redhat-marketplace-hzxnm\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") " pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.629509 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-catalog-content\") pod \"redhat-marketplace-hzxnm\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") " pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.653077 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8n69\" (UniqueName: \"kubernetes.io/projected/0867bc20-6bce-4f3e-9c03-e61e215c4102-kube-api-access-h8n69\") pod \"redhat-marketplace-hzxnm\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") " pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.691304 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.995075 4911 generic.go:334] "Generic (PLEG): container finished" podID="db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2" containerID="45d075b84e30c3a46fbe97c542fb58536416ab0e14726d56cff58867bb81d105" exitCode=0 Mar 10 14:36:49 crc kubenswrapper[4911]: I0310 14:36:49.995469 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" event={"ID":"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2","Type":"ContainerDied","Data":"45d075b84e30c3a46fbe97c542fb58536416ab0e14726d56cff58867bb81d105"} Mar 10 14:36:50 crc kubenswrapper[4911]: I0310 14:36:50.230225 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzxnm"] Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.007122 4911 generic.go:334] "Generic (PLEG): container finished" podID="0867bc20-6bce-4f3e-9c03-e61e215c4102" containerID="a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7" exitCode=0 Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.007345 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzxnm" event={"ID":"0867bc20-6bce-4f3e-9c03-e61e215c4102","Type":"ContainerDied","Data":"a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7"} Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.008936 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzxnm" event={"ID":"0867bc20-6bce-4f3e-9c03-e61e215c4102","Type":"ContainerStarted","Data":"1ba7507cbd46265be8f6a01cdebafdb72fe69dd8c7f44d955f5fdd8e993fd5c0"} Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.422998 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.575012 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-ssh-key-openstack-edpm-ipam\") pod \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.575065 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-inventory\") pod \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.575261 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtdfw\" (UniqueName: \"kubernetes.io/projected/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-kube-api-access-jtdfw\") pod \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\" (UID: \"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2\") " Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.585889 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-kube-api-access-jtdfw" (OuterVolumeSpecName: "kube-api-access-jtdfw") pod "db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2" (UID: "db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2"). InnerVolumeSpecName "kube-api-access-jtdfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.603628 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2" (UID: "db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.603804 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-inventory" (OuterVolumeSpecName: "inventory") pod "db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2" (UID: "db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.678895 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtdfw\" (UniqueName: \"kubernetes.io/projected/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-kube-api-access-jtdfw\") on node \"crc\" DevicePath \"\"" Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.679120 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 14:36:51 crc kubenswrapper[4911]: I0310 14:36:51.679135 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.019744 4911 generic.go:334] "Generic (PLEG): container finished" podID="0867bc20-6bce-4f3e-9c03-e61e215c4102" 
containerID="35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4" exitCode=0 Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.019876 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzxnm" event={"ID":"0867bc20-6bce-4f3e-9c03-e61e215c4102","Type":"ContainerDied","Data":"35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4"} Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.024922 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" event={"ID":"db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2","Type":"ContainerDied","Data":"1cc5bb930510ab0d8128edeb9bbf08f4a6671942f81606f099a3dfd6d335f38a"} Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.024959 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cc5bb930510ab0d8128edeb9bbf08f4a6671942f81606f099a3dfd6d335f38a" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.025018 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.116296 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qhk66"] Mar 10 14:36:52 crc kubenswrapper[4911]: E0310 14:36:52.116897 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.116917 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.117093 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.117862 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.120818 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.120992 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.121040 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.121152 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.129486 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qhk66"] Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.190423 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qhk66\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.190543 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq4vf\" (UniqueName: \"kubernetes.io/projected/02e6e27d-b387-4fa4-993a-525b581993c1-kube-api-access-vq4vf\") pod \"ssh-known-hosts-edpm-deployment-qhk66\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.190607 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qhk66\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.294170 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq4vf\" (UniqueName: \"kubernetes.io/projected/02e6e27d-b387-4fa4-993a-525b581993c1-kube-api-access-vq4vf\") pod \"ssh-known-hosts-edpm-deployment-qhk66\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.294841 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qhk66\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.296568 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qhk66\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.302460 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qhk66\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.302540 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qhk66\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.319154 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq4vf\" (UniqueName: \"kubernetes.io/projected/02e6e27d-b387-4fa4-993a-525b581993c1-kube-api-access-vq4vf\") pod \"ssh-known-hosts-edpm-deployment-qhk66\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:52 crc kubenswrapper[4911]: I0310 14:36:52.442095 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:36:53 crc kubenswrapper[4911]: I0310 14:36:53.012294 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qhk66"] Mar 10 14:36:53 crc kubenswrapper[4911]: W0310 14:36:53.016650 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02e6e27d_b387_4fa4_993a_525b581993c1.slice/crio-fb106c4b7f9a16eb5c4dcd6b28bddd43d8688940d3d18fd844aa942867b62b83 WatchSource:0}: Error finding container fb106c4b7f9a16eb5c4dcd6b28bddd43d8688940d3d18fd844aa942867b62b83: Status 404 returned error can't find the container with id fb106c4b7f9a16eb5c4dcd6b28bddd43d8688940d3d18fd844aa942867b62b83 Mar 10 14:36:53 crc kubenswrapper[4911]: I0310 14:36:53.035559 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzxnm" event={"ID":"0867bc20-6bce-4f3e-9c03-e61e215c4102","Type":"ContainerStarted","Data":"8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a"} 
Mar 10 14:36:53 crc kubenswrapper[4911]: I0310 14:36:53.038290 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" event={"ID":"02e6e27d-b387-4fa4-993a-525b581993c1","Type":"ContainerStarted","Data":"fb106c4b7f9a16eb5c4dcd6b28bddd43d8688940d3d18fd844aa942867b62b83"} Mar 10 14:36:53 crc kubenswrapper[4911]: I0310 14:36:53.062679 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hzxnm" podStartSLOduration=2.5693015150000003 podStartE2EDuration="4.062629799s" podCreationTimestamp="2026-03-10 14:36:49 +0000 UTC" firstStartedPulling="2026-03-10 14:36:51.010794452 +0000 UTC m=+2115.574314369" lastFinishedPulling="2026-03-10 14:36:52.504122736 +0000 UTC m=+2117.067642653" observedRunningTime="2026-03-10 14:36:53.054464938 +0000 UTC m=+2117.617984865" watchObservedRunningTime="2026-03-10 14:36:53.062629799 +0000 UTC m=+2117.626149726" Mar 10 14:36:54 crc kubenswrapper[4911]: I0310 14:36:54.063970 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" event={"ID":"02e6e27d-b387-4fa4-993a-525b581993c1","Type":"ContainerStarted","Data":"4987f465091f8ec427268a5057039a15ff82ba09731c44f6b1f2536989caa711"} Mar 10 14:36:54 crc kubenswrapper[4911]: I0310 14:36:54.083436 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" podStartSLOduration=1.907418743 podStartE2EDuration="2.083412158s" podCreationTimestamp="2026-03-10 14:36:52 +0000 UTC" firstStartedPulling="2026-03-10 14:36:53.01908629 +0000 UTC m=+2117.582606207" lastFinishedPulling="2026-03-10 14:36:53.195079715 +0000 UTC m=+2117.758599622" observedRunningTime="2026-03-10 14:36:54.077841407 +0000 UTC m=+2118.641361324" watchObservedRunningTime="2026-03-10 14:36:54.083412158 +0000 UTC m=+2118.646932075" Mar 10 14:36:55 crc kubenswrapper[4911]: I0310 14:36:55.624778 4911 
scope.go:117] "RemoveContainer" containerID="ad27ddd0f31c25b02ce6d5636c783c77dfb05417e3613d6a844c9ff4b6b454b6" Mar 10 14:36:55 crc kubenswrapper[4911]: I0310 14:36:55.685058 4911 scope.go:117] "RemoveContainer" containerID="d19318cfdd4d30c2259ddec6e16cededfb0e4291ee74cebe9fca767f6c2364b2" Mar 10 14:36:55 crc kubenswrapper[4911]: I0310 14:36:55.731031 4911 scope.go:117] "RemoveContainer" containerID="8f2346e6d23408706e1c7d4a3f0b3e851254f53cd8e7e0f9bc0d66f1da3100bf" Mar 10 14:36:59 crc kubenswrapper[4911]: I0310 14:36:59.692388 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:59 crc kubenswrapper[4911]: I0310 14:36:59.693082 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:36:59 crc kubenswrapper[4911]: I0310 14:36:59.751383 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:37:00 crc kubenswrapper[4911]: I0310 14:37:00.118766 4911 generic.go:334] "Generic (PLEG): container finished" podID="02e6e27d-b387-4fa4-993a-525b581993c1" containerID="4987f465091f8ec427268a5057039a15ff82ba09731c44f6b1f2536989caa711" exitCode=0 Mar 10 14:37:00 crc kubenswrapper[4911]: I0310 14:37:00.119901 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" event={"ID":"02e6e27d-b387-4fa4-993a-525b581993c1","Type":"ContainerDied","Data":"4987f465091f8ec427268a5057039a15ff82ba09731c44f6b1f2536989caa711"} Mar 10 14:37:00 crc kubenswrapper[4911]: I0310 14:37:00.174333 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hzxnm" Mar 10 14:37:00 crc kubenswrapper[4911]: I0310 14:37:00.748865 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzxnm"] Mar 10 14:37:01 crc 
kubenswrapper[4911]: I0310 14:37:01.530418 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" Mar 10 14:37:01 crc kubenswrapper[4911]: I0310 14:37:01.600166 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-ssh-key-openstack-edpm-ipam\") pod \"02e6e27d-b387-4fa4-993a-525b581993c1\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " Mar 10 14:37:01 crc kubenswrapper[4911]: I0310 14:37:01.600472 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-inventory-0\") pod \"02e6e27d-b387-4fa4-993a-525b581993c1\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " Mar 10 14:37:01 crc kubenswrapper[4911]: I0310 14:37:01.600657 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq4vf\" (UniqueName: \"kubernetes.io/projected/02e6e27d-b387-4fa4-993a-525b581993c1-kube-api-access-vq4vf\") pod \"02e6e27d-b387-4fa4-993a-525b581993c1\" (UID: \"02e6e27d-b387-4fa4-993a-525b581993c1\") " Mar 10 14:37:01 crc kubenswrapper[4911]: I0310 14:37:01.605950 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e6e27d-b387-4fa4-993a-525b581993c1-kube-api-access-vq4vf" (OuterVolumeSpecName: "kube-api-access-vq4vf") pod "02e6e27d-b387-4fa4-993a-525b581993c1" (UID: "02e6e27d-b387-4fa4-993a-525b581993c1"). InnerVolumeSpecName "kube-api-access-vq4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:37:01 crc kubenswrapper[4911]: I0310 14:37:01.630347 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "02e6e27d-b387-4fa4-993a-525b581993c1" (UID: "02e6e27d-b387-4fa4-993a-525b581993c1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:37:01 crc kubenswrapper[4911]: I0310 14:37:01.632889 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "02e6e27d-b387-4fa4-993a-525b581993c1" (UID: "02e6e27d-b387-4fa4-993a-525b581993c1"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:37:01 crc kubenswrapper[4911]: I0310 14:37:01.703025 4911 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:37:01 crc kubenswrapper[4911]: I0310 14:37:01.703411 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq4vf\" (UniqueName: \"kubernetes.io/projected/02e6e27d-b387-4fa4-993a-525b581993c1-kube-api-access-vq4vf\") on node \"crc\" DevicePath \"\"" Mar 10 14:37:01 crc kubenswrapper[4911]: I0310 14:37:01.703511 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02e6e27d-b387-4fa4-993a-525b581993c1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.140804 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qhk66"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.140787 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qhk66" event={"ID":"02e6e27d-b387-4fa4-993a-525b581993c1","Type":"ContainerDied","Data":"fb106c4b7f9a16eb5c4dcd6b28bddd43d8688940d3d18fd844aa942867b62b83"}
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.140863 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb106c4b7f9a16eb5c4dcd6b28bddd43d8688940d3d18fd844aa942867b62b83"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.140955 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hzxnm" podUID="0867bc20-6bce-4f3e-9c03-e61e215c4102" containerName="registry-server" containerID="cri-o://8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a" gracePeriod=2
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.269095 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"]
Mar 10 14:37:02 crc kubenswrapper[4911]: E0310 14:37:02.269561 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e6e27d-b387-4fa4-993a-525b581993c1" containerName="ssh-known-hosts-edpm-deployment"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.269584 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e6e27d-b387-4fa4-993a-525b581993c1" containerName="ssh-known-hosts-edpm-deployment"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.269840 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e6e27d-b387-4fa4-993a-525b581993c1" containerName="ssh-known-hosts-edpm-deployment"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.270595 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.276808 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.276853 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.277257 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.278026 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.338022 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"]
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.432025 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wcn6n\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.432211 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wcn6n\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.432330 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqj8\" (UniqueName: \"kubernetes.io/projected/8c81ff0d-aedd-419d-b159-b2e36b895839-kube-api-access-fwqj8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wcn6n\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.535670 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wcn6n\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.536541 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wcn6n\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.537041 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwqj8\" (UniqueName: \"kubernetes.io/projected/8c81ff0d-aedd-419d-b159-b2e36b895839-kube-api-access-fwqj8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wcn6n\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.542019 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wcn6n\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.544550 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wcn6n\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.556097 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqj8\" (UniqueName: \"kubernetes.io/projected/8c81ff0d-aedd-419d-b159-b2e36b895839-kube-api-access-fwqj8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wcn6n\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.658734 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.659930 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzxnm"
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.742749 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-catalog-content\") pod \"0867bc20-6bce-4f3e-9c03-e61e215c4102\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") "
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.742846 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8n69\" (UniqueName: \"kubernetes.io/projected/0867bc20-6bce-4f3e-9c03-e61e215c4102-kube-api-access-h8n69\") pod \"0867bc20-6bce-4f3e-9c03-e61e215c4102\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") "
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.743001 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-utilities\") pod \"0867bc20-6bce-4f3e-9c03-e61e215c4102\" (UID: \"0867bc20-6bce-4f3e-9c03-e61e215c4102\") "
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.744743 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-utilities" (OuterVolumeSpecName: "utilities") pod "0867bc20-6bce-4f3e-9c03-e61e215c4102" (UID: "0867bc20-6bce-4f3e-9c03-e61e215c4102"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.752938 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0867bc20-6bce-4f3e-9c03-e61e215c4102-kube-api-access-h8n69" (OuterVolumeSpecName: "kube-api-access-h8n69") pod "0867bc20-6bce-4f3e-9c03-e61e215c4102" (UID: "0867bc20-6bce-4f3e-9c03-e61e215c4102"). InnerVolumeSpecName "kube-api-access-h8n69". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.783544 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0867bc20-6bce-4f3e-9c03-e61e215c4102" (UID: "0867bc20-6bce-4f3e-9c03-e61e215c4102"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.845755 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.845787 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8n69\" (UniqueName: \"kubernetes.io/projected/0867bc20-6bce-4f3e-9c03-e61e215c4102-kube-api-access-h8n69\") on node \"crc\" DevicePath \"\""
Mar 10 14:37:02 crc kubenswrapper[4911]: I0310 14:37:02.845798 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0867bc20-6bce-4f3e-9c03-e61e215c4102-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.156690 4911 generic.go:334] "Generic (PLEG): container finished" podID="0867bc20-6bce-4f3e-9c03-e61e215c4102" containerID="8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a" exitCode=0
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.156764 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzxnm" event={"ID":"0867bc20-6bce-4f3e-9c03-e61e215c4102","Type":"ContainerDied","Data":"8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a"}
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.156800 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzxnm" event={"ID":"0867bc20-6bce-4f3e-9c03-e61e215c4102","Type":"ContainerDied","Data":"1ba7507cbd46265be8f6a01cdebafdb72fe69dd8c7f44d955f5fdd8e993fd5c0"}
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.156820 4911 scope.go:117] "RemoveContainer" containerID="8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a"
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.157022 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzxnm"
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.190755 4911 scope.go:117] "RemoveContainer" containerID="35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4"
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.217483 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzxnm"]
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.227571 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzxnm"]
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.232104 4911 scope.go:117] "RemoveContainer" containerID="a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7"
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.237077 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"]
Mar 10 14:37:03 crc kubenswrapper[4911]: W0310 14:37:03.240028 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c81ff0d_aedd_419d_b159_b2e36b895839.slice/crio-41d5ff3b489723090d75ac14f793656f6f449e1663a1ce554ccd8b8860294865 WatchSource:0}: Error finding container 41d5ff3b489723090d75ac14f793656f6f449e1663a1ce554ccd8b8860294865: Status 404 returned error can't find the container with id 41d5ff3b489723090d75ac14f793656f6f449e1663a1ce554ccd8b8860294865
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.257112 4911 scope.go:117] "RemoveContainer" containerID="8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a"
Mar 10 14:37:03 crc kubenswrapper[4911]: E0310 14:37:03.257777 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a\": container with ID starting with 8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a not found: ID does not exist" containerID="8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a"
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.257833 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a"} err="failed to get container status \"8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a\": rpc error: code = NotFound desc = could not find container \"8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a\": container with ID starting with 8cdaf805f3cc216879deaeb09fa839c9d08d524a0010afbadeba1cc09f3aa45a not found: ID does not exist"
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.257866 4911 scope.go:117] "RemoveContainer" containerID="35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4"
Mar 10 14:37:03 crc kubenswrapper[4911]: E0310 14:37:03.258562 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4\": container with ID starting with 35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4 not found: ID does not exist" containerID="35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4"
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.258603 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4"} err="failed to get container status \"35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4\": rpc error: code = NotFound desc = could not find container \"35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4\": container with ID starting with 35270d9484a403a1df8eaa42f8b7f1d4c53e8d7a858c49c3af66e98e335a08e4 not found: ID does not exist"
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.258629 4911 scope.go:117] "RemoveContainer" containerID="a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7"
Mar 10 14:37:03 crc kubenswrapper[4911]: E0310 14:37:03.258953 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7\": container with ID starting with a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7 not found: ID does not exist" containerID="a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7"
Mar 10 14:37:03 crc kubenswrapper[4911]: I0310 14:37:03.258974 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7"} err="failed to get container status \"a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7\": rpc error: code = NotFound desc = could not find container \"a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7\": container with ID starting with a5560a4ee417ca270d8b5eee7c22f6bf70f144ca87aafda1e9141470f70c2fc7 not found: ID does not exist"
Mar 10 14:37:04 crc kubenswrapper[4911]: I0310 14:37:04.167130 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n" event={"ID":"8c81ff0d-aedd-419d-b159-b2e36b895839","Type":"ContainerStarted","Data":"ac71edc41ff0d657bd86f25c26f786536a0f415977e9edfe67929ecd2d643edb"}
Mar 10 14:37:04 crc kubenswrapper[4911]: I0310 14:37:04.167580 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n" event={"ID":"8c81ff0d-aedd-419d-b159-b2e36b895839","Type":"ContainerStarted","Data":"41d5ff3b489723090d75ac14f793656f6f449e1663a1ce554ccd8b8860294865"}
Mar 10 14:37:04 crc kubenswrapper[4911]: I0310 14:37:04.185309 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n" podStartSLOduration=2.009510428 podStartE2EDuration="2.185285538s" podCreationTimestamp="2026-03-10 14:37:02 +0000 UTC" firstStartedPulling="2026-03-10 14:37:03.246034896 +0000 UTC m=+2127.809554813" lastFinishedPulling="2026-03-10 14:37:03.421810006 +0000 UTC m=+2127.985329923" observedRunningTime="2026-03-10 14:37:04.182885293 +0000 UTC m=+2128.746405210" watchObservedRunningTime="2026-03-10 14:37:04.185285538 +0000 UTC m=+2128.748805455"
Mar 10 14:37:04 crc kubenswrapper[4911]: I0310 14:37:04.205553 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0867bc20-6bce-4f3e-9c03-e61e215c4102" path="/var/lib/kubelet/pods/0867bc20-6bce-4f3e-9c03-e61e215c4102/volumes"
Mar 10 14:37:11 crc kubenswrapper[4911]: I0310 14:37:11.044608 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hgssg"]
Mar 10 14:37:11 crc kubenswrapper[4911]: I0310 14:37:11.054452 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hgssg"]
Mar 10 14:37:11 crc kubenswrapper[4911]: I0310 14:37:11.242886 4911 generic.go:334] "Generic (PLEG): container finished" podID="8c81ff0d-aedd-419d-b159-b2e36b895839" containerID="ac71edc41ff0d657bd86f25c26f786536a0f415977e9edfe67929ecd2d643edb" exitCode=0
Mar 10 14:37:11 crc kubenswrapper[4911]: I0310 14:37:11.242934 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n" event={"ID":"8c81ff0d-aedd-419d-b159-b2e36b895839","Type":"ContainerDied","Data":"ac71edc41ff0d657bd86f25c26f786536a0f415977e9edfe67929ecd2d643edb"}
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.205074 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d485c296-ac7f-4d09-ad90-470f8b608207" path="/var/lib/kubelet/pods/d485c296-ac7f-4d09-ad90-470f8b608207/volumes"
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.676703 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.762670 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-inventory\") pod \"8c81ff0d-aedd-419d-b159-b2e36b895839\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") "
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.762926 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwqj8\" (UniqueName: \"kubernetes.io/projected/8c81ff0d-aedd-419d-b159-b2e36b895839-kube-api-access-fwqj8\") pod \"8c81ff0d-aedd-419d-b159-b2e36b895839\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") "
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.763068 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-ssh-key-openstack-edpm-ipam\") pod \"8c81ff0d-aedd-419d-b159-b2e36b895839\" (UID: \"8c81ff0d-aedd-419d-b159-b2e36b895839\") "
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.786789 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c81ff0d-aedd-419d-b159-b2e36b895839-kube-api-access-fwqj8" (OuterVolumeSpecName: "kube-api-access-fwqj8") pod "8c81ff0d-aedd-419d-b159-b2e36b895839" (UID: "8c81ff0d-aedd-419d-b159-b2e36b895839"). InnerVolumeSpecName "kube-api-access-fwqj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.793313 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-inventory" (OuterVolumeSpecName: "inventory") pod "8c81ff0d-aedd-419d-b159-b2e36b895839" (UID: "8c81ff0d-aedd-419d-b159-b2e36b895839"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.820487 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8c81ff0d-aedd-419d-b159-b2e36b895839" (UID: "8c81ff0d-aedd-419d-b159-b2e36b895839"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.865955 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwqj8\" (UniqueName: \"kubernetes.io/projected/8c81ff0d-aedd-419d-b159-b2e36b895839-kube-api-access-fwqj8\") on node \"crc\" DevicePath \"\""
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.865999 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 14:37:12 crc kubenswrapper[4911]: I0310 14:37:12.866015 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c81ff0d-aedd-419d-b159-b2e36b895839-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.264083 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n" event={"ID":"8c81ff0d-aedd-419d-b159-b2e36b895839","Type":"ContainerDied","Data":"41d5ff3b489723090d75ac14f793656f6f449e1663a1ce554ccd8b8860294865"}
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.264132 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41d5ff3b489723090d75ac14f793656f6f449e1663a1ce554ccd8b8860294865"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.264249 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wcn6n"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.431177 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"]
Mar 10 14:37:13 crc kubenswrapper[4911]: E0310 14:37:13.432010 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0867bc20-6bce-4f3e-9c03-e61e215c4102" containerName="registry-server"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.432036 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0867bc20-6bce-4f3e-9c03-e61e215c4102" containerName="registry-server"
Mar 10 14:37:13 crc kubenswrapper[4911]: E0310 14:37:13.432059 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c81ff0d-aedd-419d-b159-b2e36b895839" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.432067 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c81ff0d-aedd-419d-b159-b2e36b895839" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:37:13 crc kubenswrapper[4911]: E0310 14:37:13.432085 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0867bc20-6bce-4f3e-9c03-e61e215c4102" containerName="extract-content"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.432092 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0867bc20-6bce-4f3e-9c03-e61e215c4102" containerName="extract-content"
Mar 10 14:37:13 crc kubenswrapper[4911]: E0310 14:37:13.432111 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0867bc20-6bce-4f3e-9c03-e61e215c4102" containerName="extract-utilities"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.432120 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0867bc20-6bce-4f3e-9c03-e61e215c4102" containerName="extract-utilities"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.432340 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="0867bc20-6bce-4f3e-9c03-e61e215c4102" containerName="registry-server"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.432386 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c81ff0d-aedd-419d-b159-b2e36b895839" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.433295 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.436680 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.436960 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.437194 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.439051 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.460305 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"]
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.581491 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.581552 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmddh\" (UniqueName: \"kubernetes.io/projected/7b847208-7241-442f-8b60-b153986d1ea3-kube-api-access-fmddh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.581601 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.683176 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.683236 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmddh\" (UniqueName: \"kubernetes.io/projected/7b847208-7241-442f-8b60-b153986d1ea3-kube-api-access-fmddh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.683309 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.694699 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.696981 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.706804 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmddh\" (UniqueName: \"kubernetes.io/projected/7b847208-7241-442f-8b60-b153986d1ea3-kube-api-access-fmddh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:13 crc kubenswrapper[4911]: I0310 14:37:13.759866 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:14 crc kubenswrapper[4911]: I0310 14:37:14.403845 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"]
Mar 10 14:37:15 crc kubenswrapper[4911]: I0310 14:37:15.308348 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t" event={"ID":"7b847208-7241-442f-8b60-b153986d1ea3","Type":"ContainerStarted","Data":"bd77b95d0680be4ab4ff658822ce40ebc6b3232d2ae03a0b12eeb8c7e43403de"}
Mar 10 14:37:15 crc kubenswrapper[4911]: I0310 14:37:15.309863 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t" event={"ID":"7b847208-7241-442f-8b60-b153986d1ea3","Type":"ContainerStarted","Data":"1bdd7284d9e52ee9e28b1688d0957764db5a96f4f9e823c5b1652552231761be"}
Mar 10 14:37:15 crc kubenswrapper[4911]: I0310 14:37:15.337577 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t" podStartSLOduration=2.192712528 podStartE2EDuration="2.337550519s" podCreationTimestamp="2026-03-10 14:37:13 +0000 UTC" firstStartedPulling="2026-03-10 14:37:14.410093327 +0000 UTC m=+2138.973613244" lastFinishedPulling="2026-03-10 14:37:14.554931318 +0000 UTC m=+2139.118451235" observedRunningTime="2026-03-10 14:37:15.330618601 +0000 UTC m=+2139.894138518" watchObservedRunningTime="2026-03-10 14:37:15.337550519 +0000 UTC m=+2139.901070436"
Mar 10 14:37:24 crc kubenswrapper[4911]: I0310 14:37:24.408912 4911 generic.go:334] "Generic (PLEG): container finished" podID="7b847208-7241-442f-8b60-b153986d1ea3" containerID="bd77b95d0680be4ab4ff658822ce40ebc6b3232d2ae03a0b12eeb8c7e43403de" exitCode=0
Mar 10 14:37:24 crc kubenswrapper[4911]: I0310 14:37:24.409002 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t" event={"ID":"7b847208-7241-442f-8b60-b153986d1ea3","Type":"ContainerDied","Data":"bd77b95d0680be4ab4ff658822ce40ebc6b3232d2ae03a0b12eeb8c7e43403de"}
Mar 10 14:37:25 crc kubenswrapper[4911]: I0310 14:37:25.976797 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.076536 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-ssh-key-openstack-edpm-ipam\") pod \"7b847208-7241-442f-8b60-b153986d1ea3\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") "
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.076619 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-inventory\") pod \"7b847208-7241-442f-8b60-b153986d1ea3\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") "
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.076782 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmddh\" (UniqueName: \"kubernetes.io/projected/7b847208-7241-442f-8b60-b153986d1ea3-kube-api-access-fmddh\") pod \"7b847208-7241-442f-8b60-b153986d1ea3\" (UID: \"7b847208-7241-442f-8b60-b153986d1ea3\") "
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.082806 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b847208-7241-442f-8b60-b153986d1ea3-kube-api-access-fmddh" (OuterVolumeSpecName: "kube-api-access-fmddh") pod "7b847208-7241-442f-8b60-b153986d1ea3" (UID: "7b847208-7241-442f-8b60-b153986d1ea3"). InnerVolumeSpecName "kube-api-access-fmddh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.104694 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7b847208-7241-442f-8b60-b153986d1ea3" (UID: "7b847208-7241-442f-8b60-b153986d1ea3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.105471 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-inventory" (OuterVolumeSpecName: "inventory") pod "7b847208-7241-442f-8b60-b153986d1ea3" (UID: "7b847208-7241-442f-8b60-b153986d1ea3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.179368 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.179752 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b847208-7241-442f-8b60-b153986d1ea3-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.179767 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmddh\" (UniqueName: \"kubernetes.io/projected/7b847208-7241-442f-8b60-b153986d1ea3-kube-api-access-fmddh\") on node \"crc\" DevicePath \"\""
Mar 10 14:37:26 crc kubenswrapper[4911]: E0310 14:37:26.343698 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b847208_7241_442f_8b60_b153986d1ea3.slice\": RecentStats: unable to find data in memory cache]"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.431695 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t" event={"ID":"7b847208-7241-442f-8b60-b153986d1ea3","Type":"ContainerDied","Data":"1bdd7284d9e52ee9e28b1688d0957764db5a96f4f9e823c5b1652552231761be"}
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.431767 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bdd7284d9e52ee9e28b1688d0957764db5a96f4f9e823c5b1652552231761be"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.431863 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.518008 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr"]
Mar 10 14:37:26 crc kubenswrapper[4911]: E0310 14:37:26.518567 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b847208-7241-442f-8b60-b153986d1ea3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.518586 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b847208-7241-442f-8b60-b153986d1ea3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.518839 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b847208-7241-442f-8b60-b153986d1ea3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.519771 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.526978 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.527310 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.527519 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.528922 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.529005 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.529050 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.530898 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.531316 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.544965 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr"]
Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.592589 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.592664 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.592784 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b87jz\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-kube-api-access-b87jz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.592848 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.592903 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.592939 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.593107 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.593252 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.593349 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.593454 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.593649 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.593747 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.593828 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.593889 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696073 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696155 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696189 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696233 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696286 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696324 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b87jz\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-kube-api-access-b87jz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696367 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696417 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696444 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696484 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696522 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: 
\"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696559 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696581 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.696623 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.702318 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc 
kubenswrapper[4911]: I0310 14:37:26.702596 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.703130 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.703252 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.703558 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.703948 4911 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.704113 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.704142 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.705252 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.705258 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.705483 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.706079 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.708699 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.716587 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b87jz\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-kube-api-access-b87jz\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:26 crc kubenswrapper[4911]: I0310 14:37:26.839562 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:37:27 crc kubenswrapper[4911]: I0310 14:37:27.394960 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr"] Mar 10 14:37:27 crc kubenswrapper[4911]: I0310 14:37:27.442795 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" event={"ID":"5d1a5e0b-536c-4d5f-9c65-595361611fcd","Type":"ContainerStarted","Data":"ff0334b04c63c8bb19e9c3c2ab83b8a6940cd166d66528465b06d80ed3a8f4a7"} Mar 10 14:37:28 crc kubenswrapper[4911]: I0310 14:37:28.454064 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" event={"ID":"5d1a5e0b-536c-4d5f-9c65-595361611fcd","Type":"ContainerStarted","Data":"0f2444a3ca257bfa11e8af72ee968606aae246097786ab9e4239b963579470a6"} Mar 10 14:37:28 crc kubenswrapper[4911]: I0310 14:37:28.502024 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" podStartSLOduration=2.322304878 podStartE2EDuration="2.502005123s" podCreationTimestamp="2026-03-10 14:37:26 +0000 UTC" firstStartedPulling="2026-03-10 14:37:27.400960241 +0000 UTC m=+2151.964480158" lastFinishedPulling="2026-03-10 14:37:27.580660476 +0000 UTC m=+2152.144180403" observedRunningTime="2026-03-10 14:37:28.500534233 +0000 UTC m=+2153.064054180" watchObservedRunningTime="2026-03-10 14:37:28.502005123 +0000 UTC m=+2153.065525040" Mar 10 14:37:55 crc kubenswrapper[4911]: I0310 14:37:55.868009 4911 
scope.go:117] "RemoveContainer" containerID="628cde62ea5a91c5b4ffbc915f3335d589fc2c767f9fb9a8faeaeb3dbd805596" Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.150652 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552558-n9c5z"] Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.154390 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552558-n9c5z" Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.159032 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.159243 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.159391 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.161840 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552558-n9c5z"] Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.265178 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx5hc\" (UniqueName: \"kubernetes.io/projected/3f99285f-2c0b-4a1a-920b-1daa90b49b19-kube-api-access-lx5hc\") pod \"auto-csr-approver-29552558-n9c5z\" (UID: \"3f99285f-2c0b-4a1a-920b-1daa90b49b19\") " pod="openshift-infra/auto-csr-approver-29552558-n9c5z" Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.367406 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx5hc\" (UniqueName: \"kubernetes.io/projected/3f99285f-2c0b-4a1a-920b-1daa90b49b19-kube-api-access-lx5hc\") pod \"auto-csr-approver-29552558-n9c5z\" (UID: \"3f99285f-2c0b-4a1a-920b-1daa90b49b19\") " 
pod="openshift-infra/auto-csr-approver-29552558-n9c5z" Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.391530 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx5hc\" (UniqueName: \"kubernetes.io/projected/3f99285f-2c0b-4a1a-920b-1daa90b49b19-kube-api-access-lx5hc\") pod \"auto-csr-approver-29552558-n9c5z\" (UID: \"3f99285f-2c0b-4a1a-920b-1daa90b49b19\") " pod="openshift-infra/auto-csr-approver-29552558-n9c5z" Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.480792 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552558-n9c5z" Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.916900 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552558-n9c5z"] Mar 10 14:38:00 crc kubenswrapper[4911]: I0310 14:38:00.922357 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 14:38:01 crc kubenswrapper[4911]: I0310 14:38:01.813125 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552558-n9c5z" event={"ID":"3f99285f-2c0b-4a1a-920b-1daa90b49b19","Type":"ContainerStarted","Data":"2ebe6e51f2181c29137c7b7f6af14a46ed8b25a1ce40c62342136a424546b945"} Mar 10 14:38:02 crc kubenswrapper[4911]: I0310 14:38:02.824711 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552558-n9c5z" event={"ID":"3f99285f-2c0b-4a1a-920b-1daa90b49b19","Type":"ContainerStarted","Data":"7f21fef8a6e95af340b78a09afc4026e5c47b882f3368268a824f9a355c4ff35"} Mar 10 14:38:02 crc kubenswrapper[4911]: I0310 14:38:02.828573 4911 generic.go:334] "Generic (PLEG): container finished" podID="5d1a5e0b-536c-4d5f-9c65-595361611fcd" containerID="0f2444a3ca257bfa11e8af72ee968606aae246097786ab9e4239b963579470a6" exitCode=0 Mar 10 14:38:02 crc kubenswrapper[4911]: I0310 14:38:02.828617 4911 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" event={"ID":"5d1a5e0b-536c-4d5f-9c65-595361611fcd","Type":"ContainerDied","Data":"0f2444a3ca257bfa11e8af72ee968606aae246097786ab9e4239b963579470a6"} Mar 10 14:38:02 crc kubenswrapper[4911]: I0310 14:38:02.849743 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552558-n9c5z" podStartSLOduration=1.4791439 podStartE2EDuration="2.849702299s" podCreationTimestamp="2026-03-10 14:38:00 +0000 UTC" firstStartedPulling="2026-03-10 14:38:00.922127408 +0000 UTC m=+2185.485647325" lastFinishedPulling="2026-03-10 14:38:02.292685797 +0000 UTC m=+2186.856205724" observedRunningTime="2026-03-10 14:38:02.839136893 +0000 UTC m=+2187.402656810" watchObservedRunningTime="2026-03-10 14:38:02.849702299 +0000 UTC m=+2187.413222226" Mar 10 14:38:03 crc kubenswrapper[4911]: I0310 14:38:03.841637 4911 generic.go:334] "Generic (PLEG): container finished" podID="3f99285f-2c0b-4a1a-920b-1daa90b49b19" containerID="7f21fef8a6e95af340b78a09afc4026e5c47b882f3368268a824f9a355c4ff35" exitCode=0 Mar 10 14:38:03 crc kubenswrapper[4911]: I0310 14:38:03.841691 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552558-n9c5z" event={"ID":"3f99285f-2c0b-4a1a-920b-1daa90b49b19","Type":"ContainerDied","Data":"7f21fef8a6e95af340b78a09afc4026e5c47b882f3368268a824f9a355c4ff35"} Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.354560 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.468702 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-nova-combined-ca-bundle\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.468832 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-libvirt-combined-ca-bundle\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.468938 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ssh-key-openstack-edpm-ipam\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.469023 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.469112 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b87jz\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-kube-api-access-b87jz\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") 
" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.469268 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ovn-combined-ca-bundle\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.469692 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.469833 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-bootstrap-combined-ca-bundle\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.469928 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.469997 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-inventory\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.470075 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-repo-setup-combined-ca-bundle\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.470161 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-telemetry-combined-ca-bundle\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.470191 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-neutron-metadata-combined-ca-bundle\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.470257 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\" (UID: \"5d1a5e0b-536c-4d5f-9c65-595361611fcd\") " Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.479053 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.481114 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.481256 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.481558 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.481768 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.481872 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.481865 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.484419 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.485076 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-kube-api-access-b87jz" (OuterVolumeSpecName: "kube-api-access-b87jz") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "kube-api-access-b87jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.485147 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.486033 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.489975 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.512818 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-inventory" (OuterVolumeSpecName: "inventory") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.514480 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5d1a5e0b-536c-4d5f-9c65-595361611fcd" (UID: "5d1a5e0b-536c-4d5f-9c65-595361611fcd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.573996 4911 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574041 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b87jz\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-kube-api-access-b87jz\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574059 4911 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574072 4911 reconciler_common.go:293] "Volume detached for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574087 4911 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574100 4911 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574115 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574127 4911 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574139 4911 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574152 4911 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 
crc kubenswrapper[4911]: I0310 14:38:04.574165 4911 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d1a5e0b-536c-4d5f-9c65-595361611fcd-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574180 4911 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574191 4911 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.574202 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d1a5e0b-536c-4d5f-9c65-595361611fcd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.852608 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.852747 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr" event={"ID":"5d1a5e0b-536c-4d5f-9c65-595361611fcd","Type":"ContainerDied","Data":"ff0334b04c63c8bb19e9c3c2ab83b8a6940cd166d66528465b06d80ed3a8f4a7"} Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.852794 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff0334b04c63c8bb19e9c3c2ab83b8a6940cd166d66528465b06d80ed3a8f4a7" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.964179 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd"] Mar 10 14:38:04 crc kubenswrapper[4911]: E0310 14:38:04.968050 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1a5e0b-536c-4d5f-9c65-595361611fcd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.968085 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1a5e0b-536c-4d5f-9c65-595361611fcd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.968401 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1a5e0b-536c-4d5f-9c65-595361611fcd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.969407 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.972545 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.972664 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.972770 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.972891 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.973135 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 10 14:38:04 crc kubenswrapper[4911]: I0310 14:38:04.996329 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd"] Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.083866 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.084365 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.084564 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d4d304b-5bae-475d-9d99-da422d354bb0-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.084785 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bhnd\" (UniqueName: \"kubernetes.io/projected/3d4d304b-5bae-475d-9d99-da422d354bb0-kube-api-access-4bhnd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.084849 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.187282 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.187771 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d4d304b-5bae-475d-9d99-da422d354bb0-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.187850 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bhnd\" (UniqueName: \"kubernetes.io/projected/3d4d304b-5bae-475d-9d99-da422d354bb0-kube-api-access-4bhnd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.187912 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.187986 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.189199 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d4d304b-5bae-475d-9d99-da422d354bb0-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.195223 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.196270 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.196904 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.206289 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bhnd\" (UniqueName: \"kubernetes.io/projected/3d4d304b-5bae-475d-9d99-da422d354bb0-kube-api-access-4bhnd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-prhkd\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.291575 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552558-n9c5z" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.292974 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.393514 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx5hc\" (UniqueName: \"kubernetes.io/projected/3f99285f-2c0b-4a1a-920b-1daa90b49b19-kube-api-access-lx5hc\") pod \"3f99285f-2c0b-4a1a-920b-1daa90b49b19\" (UID: \"3f99285f-2c0b-4a1a-920b-1daa90b49b19\") " Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.403293 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f99285f-2c0b-4a1a-920b-1daa90b49b19-kube-api-access-lx5hc" (OuterVolumeSpecName: "kube-api-access-lx5hc") pod "3f99285f-2c0b-4a1a-920b-1daa90b49b19" (UID: "3f99285f-2c0b-4a1a-920b-1daa90b49b19"). InnerVolumeSpecName "kube-api-access-lx5hc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.497037 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx5hc\" (UniqueName: \"kubernetes.io/projected/3f99285f-2c0b-4a1a-920b-1daa90b49b19-kube-api-access-lx5hc\") on node \"crc\" DevicePath \"\"" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.848057 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd"] Mar 10 14:38:05 crc kubenswrapper[4911]: W0310 14:38:05.853016 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d4d304b_5bae_475d_9d99_da422d354bb0.slice/crio-66e6ec17db8d833bd40c8771a9adf002e3a61d956f5866fe6c84940078c6848f WatchSource:0}: Error finding container 66e6ec17db8d833bd40c8771a9adf002e3a61d956f5866fe6c84940078c6848f: Status 404 returned error can't find the container with id 66e6ec17db8d833bd40c8771a9adf002e3a61d956f5866fe6c84940078c6848f Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.871714 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552558-n9c5z" event={"ID":"3f99285f-2c0b-4a1a-920b-1daa90b49b19","Type":"ContainerDied","Data":"2ebe6e51f2181c29137c7b7f6af14a46ed8b25a1ce40c62342136a424546b945"} Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.871797 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ebe6e51f2181c29137c7b7f6af14a46ed8b25a1ce40c62342136a424546b945" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.871888 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552558-n9c5z" Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.874710 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" event={"ID":"3d4d304b-5bae-475d-9d99-da422d354bb0","Type":"ContainerStarted","Data":"66e6ec17db8d833bd40c8771a9adf002e3a61d956f5866fe6c84940078c6848f"} Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.927993 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552552-hclld"] Mar 10 14:38:05 crc kubenswrapper[4911]: I0310 14:38:05.955991 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552552-hclld"] Mar 10 14:38:06 crc kubenswrapper[4911]: I0310 14:38:06.245123 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06250593-6f10-4848-a3d7-dc782bcee227" path="/var/lib/kubelet/pods/06250593-6f10-4848-a3d7-dc782bcee227/volumes" Mar 10 14:38:06 crc kubenswrapper[4911]: I0310 14:38:06.888835 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" event={"ID":"3d4d304b-5bae-475d-9d99-da422d354bb0","Type":"ContainerStarted","Data":"d073cb37b0517b34d24336502a9fbd70e17c24e1576f05bea1a47b2f2a4f6c00"} Mar 10 14:38:06 crc kubenswrapper[4911]: I0310 14:38:06.917490 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" podStartSLOduration=2.6891514880000003 podStartE2EDuration="2.917454689s" podCreationTimestamp="2026-03-10 14:38:04 +0000 UTC" firstStartedPulling="2026-03-10 14:38:05.857094928 +0000 UTC m=+2190.420614855" lastFinishedPulling="2026-03-10 14:38:06.085398139 +0000 UTC m=+2190.648918056" observedRunningTime="2026-03-10 14:38:06.908596019 +0000 UTC m=+2191.472115946" watchObservedRunningTime="2026-03-10 14:38:06.917454689 +0000 UTC m=+2191.480974606" Mar 10 
14:38:48 crc kubenswrapper[4911]: I0310 14:38:48.521185 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:38:48 crc kubenswrapper[4911]: I0310 14:38:48.521616 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.082492 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d5q6r"]
Mar 10 14:38:49 crc kubenswrapper[4911]: E0310 14:38:49.083078 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f99285f-2c0b-4a1a-920b-1daa90b49b19" containerName="oc"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.083101 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f99285f-2c0b-4a1a-920b-1daa90b49b19" containerName="oc"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.083347 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f99285f-2c0b-4a1a-920b-1daa90b49b19" containerName="oc"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.085166 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.111928 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5q6r"]
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.244064 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p88d\" (UniqueName: \"kubernetes.io/projected/abe372cc-0913-46d0-a2a3-e1f628459e04-kube-api-access-5p88d\") pod \"community-operators-d5q6r\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") " pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.244402 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-utilities\") pod \"community-operators-d5q6r\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") " pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.244454 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-catalog-content\") pod \"community-operators-d5q6r\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") " pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.346417 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-catalog-content\") pod \"community-operators-d5q6r\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") " pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.347477 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p88d\" (UniqueName: \"kubernetes.io/projected/abe372cc-0913-46d0-a2a3-e1f628459e04-kube-api-access-5p88d\") pod \"community-operators-d5q6r\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") " pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.347685 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-utilities\") pod \"community-operators-d5q6r\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") " pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.347794 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-catalog-content\") pod \"community-operators-d5q6r\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") " pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.348181 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-utilities\") pod \"community-operators-d5q6r\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") " pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.372189 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p88d\" (UniqueName: \"kubernetes.io/projected/abe372cc-0913-46d0-a2a3-e1f628459e04-kube-api-access-5p88d\") pod \"community-operators-d5q6r\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") " pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.410291 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:49 crc kubenswrapper[4911]: W0310 14:38:49.982892 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe372cc_0913_46d0_a2a3_e1f628459e04.slice/crio-cf8e01077d0154a8a630c013415771e83c6b063ac098ad25bc0e9933db2c44bc WatchSource:0}: Error finding container cf8e01077d0154a8a630c013415771e83c6b063ac098ad25bc0e9933db2c44bc: Status 404 returned error can't find the container with id cf8e01077d0154a8a630c013415771e83c6b063ac098ad25bc0e9933db2c44bc
Mar 10 14:38:49 crc kubenswrapper[4911]: I0310 14:38:49.984017 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5q6r"]
Mar 10 14:38:50 crc kubenswrapper[4911]: I0310 14:38:50.312465 4911 generic.go:334] "Generic (PLEG): container finished" podID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerID="33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15" exitCode=0
Mar 10 14:38:50 crc kubenswrapper[4911]: I0310 14:38:50.312574 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5q6r" event={"ID":"abe372cc-0913-46d0-a2a3-e1f628459e04","Type":"ContainerDied","Data":"33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15"}
Mar 10 14:38:50 crc kubenswrapper[4911]: I0310 14:38:50.312891 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5q6r" event={"ID":"abe372cc-0913-46d0-a2a3-e1f628459e04","Type":"ContainerStarted","Data":"cf8e01077d0154a8a630c013415771e83c6b063ac098ad25bc0e9933db2c44bc"}
Mar 10 14:38:52 crc kubenswrapper[4911]: I0310 14:38:52.333677 4911 generic.go:334] "Generic (PLEG): container finished" podID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerID="26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484" exitCode=0
Mar 10 14:38:52 crc kubenswrapper[4911]: I0310 14:38:52.333752 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5q6r" event={"ID":"abe372cc-0913-46d0-a2a3-e1f628459e04","Type":"ContainerDied","Data":"26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484"}
Mar 10 14:38:53 crc kubenswrapper[4911]: I0310 14:38:53.359383 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5q6r" event={"ID":"abe372cc-0913-46d0-a2a3-e1f628459e04","Type":"ContainerStarted","Data":"dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446"}
Mar 10 14:38:53 crc kubenswrapper[4911]: I0310 14:38:53.389623 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d5q6r" podStartSLOduration=1.963828469 podStartE2EDuration="4.389595219s" podCreationTimestamp="2026-03-10 14:38:49 +0000 UTC" firstStartedPulling="2026-03-10 14:38:50.314284492 +0000 UTC m=+2234.877804409" lastFinishedPulling="2026-03-10 14:38:52.740051242 +0000 UTC m=+2237.303571159" observedRunningTime="2026-03-10 14:38:53.382834266 +0000 UTC m=+2237.946354183" watchObservedRunningTime="2026-03-10 14:38:53.389595219 +0000 UTC m=+2237.953115156"
Mar 10 14:38:55 crc kubenswrapper[4911]: I0310 14:38:55.980010 4911 scope.go:117] "RemoveContainer" containerID="81f42c0acca3bb9babf6d48a2090da8bc8ad314cd86fed989d4636807fc6cce6"
Mar 10 14:38:59 crc kubenswrapper[4911]: I0310 14:38:59.411812 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:59 crc kubenswrapper[4911]: I0310 14:38:59.412433 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:59 crc kubenswrapper[4911]: I0310 14:38:59.463045 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:59 crc kubenswrapper[4911]: I0310 14:38:59.518045 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:38:59 crc kubenswrapper[4911]: I0310 14:38:59.707233 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d5q6r"]
Mar 10 14:39:01 crc kubenswrapper[4911]: I0310 14:39:01.458449 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d5q6r" podUID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerName="registry-server" containerID="cri-o://dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446" gracePeriod=2
Mar 10 14:39:01 crc kubenswrapper[4911]: I0310 14:39:01.935351 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.049571 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p88d\" (UniqueName: \"kubernetes.io/projected/abe372cc-0913-46d0-a2a3-e1f628459e04-kube-api-access-5p88d\") pod \"abe372cc-0913-46d0-a2a3-e1f628459e04\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") "
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.049963 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-catalog-content\") pod \"abe372cc-0913-46d0-a2a3-e1f628459e04\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") "
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.050037 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-utilities\") pod \"abe372cc-0913-46d0-a2a3-e1f628459e04\" (UID: \"abe372cc-0913-46d0-a2a3-e1f628459e04\") "
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.051319 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-utilities" (OuterVolumeSpecName: "utilities") pod "abe372cc-0913-46d0-a2a3-e1f628459e04" (UID: "abe372cc-0913-46d0-a2a3-e1f628459e04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.057881 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe372cc-0913-46d0-a2a3-e1f628459e04-kube-api-access-5p88d" (OuterVolumeSpecName: "kube-api-access-5p88d") pod "abe372cc-0913-46d0-a2a3-e1f628459e04" (UID: "abe372cc-0913-46d0-a2a3-e1f628459e04"). InnerVolumeSpecName "kube-api-access-5p88d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.152817 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.152851 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p88d\" (UniqueName: \"kubernetes.io/projected/abe372cc-0913-46d0-a2a3-e1f628459e04-kube-api-access-5p88d\") on node \"crc\" DevicePath \"\""
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.469798 4911 generic.go:334] "Generic (PLEG): container finished" podID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerID="dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446" exitCode=0
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.469846 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5q6r" event={"ID":"abe372cc-0913-46d0-a2a3-e1f628459e04","Type":"ContainerDied","Data":"dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446"}
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.469879 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5q6r" event={"ID":"abe372cc-0913-46d0-a2a3-e1f628459e04","Type":"ContainerDied","Data":"cf8e01077d0154a8a630c013415771e83c6b063ac098ad25bc0e9933db2c44bc"}
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.469900 4911 scope.go:117] "RemoveContainer" containerID="dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.469969 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5q6r"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.496161 4911 scope.go:117] "RemoveContainer" containerID="26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.519001 4911 scope.go:117] "RemoveContainer" containerID="33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.566404 4911 scope.go:117] "RemoveContainer" containerID="dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446"
Mar 10 14:39:02 crc kubenswrapper[4911]: E0310 14:39:02.567112 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446\": container with ID starting with dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446 not found: ID does not exist" containerID="dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.567164 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446"} err="failed to get container status \"dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446\": rpc error: code = NotFound desc = could not find container \"dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446\": container with ID starting with dc091c221dfdaad010c86e7a2bca84fb720443b5d667fa7d99056baf8ab6a446 not found: ID does not exist"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.567208 4911 scope.go:117] "RemoveContainer" containerID="26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484"
Mar 10 14:39:02 crc kubenswrapper[4911]: E0310 14:39:02.567868 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484\": container with ID starting with 26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484 not found: ID does not exist" containerID="26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.567913 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484"} err="failed to get container status \"26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484\": rpc error: code = NotFound desc = could not find container \"26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484\": container with ID starting with 26ac27b81edd54c57fbfe0795e4ae2e0de53614566dc126311b6d4d42f746484 not found: ID does not exist"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.567949 4911 scope.go:117] "RemoveContainer" containerID="33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15"
Mar 10 14:39:02 crc kubenswrapper[4911]: E0310 14:39:02.568209 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15\": container with ID starting with 33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15 not found: ID does not exist" containerID="33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.568252 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15"} err="failed to get container status \"33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15\": rpc error: code = NotFound desc = could not find container \"33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15\": container with ID starting with 33f54eb6de1ea49833435aa0a6b29af47e3c78c4379fad9348e5a783bc84bd15 not found: ID does not exist"
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.817458 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abe372cc-0913-46d0-a2a3-e1f628459e04" (UID: "abe372cc-0913-46d0-a2a3-e1f628459e04"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:39:02 crc kubenswrapper[4911]: I0310 14:39:02.873346 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe372cc-0913-46d0-a2a3-e1f628459e04-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:39:03 crc kubenswrapper[4911]: I0310 14:39:03.109475 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d5q6r"]
Mar 10 14:39:03 crc kubenswrapper[4911]: I0310 14:39:03.118690 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d5q6r"]
Mar 10 14:39:03 crc kubenswrapper[4911]: I0310 14:39:03.480437 4911 generic.go:334] "Generic (PLEG): container finished" podID="3d4d304b-5bae-475d-9d99-da422d354bb0" containerID="d073cb37b0517b34d24336502a9fbd70e17c24e1576f05bea1a47b2f2a4f6c00" exitCode=0
Mar 10 14:39:03 crc kubenswrapper[4911]: I0310 14:39:03.480500 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" event={"ID":"3d4d304b-5bae-475d-9d99-da422d354bb0","Type":"ContainerDied","Data":"d073cb37b0517b34d24336502a9fbd70e17c24e1576f05bea1a47b2f2a4f6c00"}
Mar 10 14:39:04 crc kubenswrapper[4911]: I0310 14:39:04.210510 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe372cc-0913-46d0-a2a3-e1f628459e04" path="/var/lib/kubelet/pods/abe372cc-0913-46d0-a2a3-e1f628459e04/volumes"
Mar 10 14:39:04 crc kubenswrapper[4911]: I0310 14:39:04.928845 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.119854 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ssh-key-openstack-edpm-ipam\") pod \"3d4d304b-5bae-475d-9d99-da422d354bb0\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") "
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.119942 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ovn-combined-ca-bundle\") pod \"3d4d304b-5bae-475d-9d99-da422d354bb0\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") "
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.120048 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bhnd\" (UniqueName: \"kubernetes.io/projected/3d4d304b-5bae-475d-9d99-da422d354bb0-kube-api-access-4bhnd\") pod \"3d4d304b-5bae-475d-9d99-da422d354bb0\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") "
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.120087 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-inventory\") pod \"3d4d304b-5bae-475d-9d99-da422d354bb0\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") "
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.120223 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d4d304b-5bae-475d-9d99-da422d354bb0-ovncontroller-config-0\") pod \"3d4d304b-5bae-475d-9d99-da422d354bb0\" (UID: \"3d4d304b-5bae-475d-9d99-da422d354bb0\") "
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.126821 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4d304b-5bae-475d-9d99-da422d354bb0-kube-api-access-4bhnd" (OuterVolumeSpecName: "kube-api-access-4bhnd") pod "3d4d304b-5bae-475d-9d99-da422d354bb0" (UID: "3d4d304b-5bae-475d-9d99-da422d354bb0"). InnerVolumeSpecName "kube-api-access-4bhnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.126992 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3d4d304b-5bae-475d-9d99-da422d354bb0" (UID: "3d4d304b-5bae-475d-9d99-da422d354bb0"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.150109 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3d4d304b-5bae-475d-9d99-da422d354bb0" (UID: "3d4d304b-5bae-475d-9d99-da422d354bb0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.155916 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d4d304b-5bae-475d-9d99-da422d354bb0-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3d4d304b-5bae-475d-9d99-da422d354bb0" (UID: "3d4d304b-5bae-475d-9d99-da422d354bb0"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.162097 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-inventory" (OuterVolumeSpecName: "inventory") pod "3d4d304b-5bae-475d-9d99-da422d354bb0" (UID: "3d4d304b-5bae-475d-9d99-da422d354bb0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.223187 4911 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d4d304b-5bae-475d-9d99-da422d354bb0-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.223235 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.223254 4911 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.223269 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bhnd\" (UniqueName: \"kubernetes.io/projected/3d4d304b-5bae-475d-9d99-da422d354bb0-kube-api-access-4bhnd\") on node \"crc\" DevicePath \"\""
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.223288 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d4d304b-5bae-475d-9d99-da422d354bb0-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.504631 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.504495 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-prhkd" event={"ID":"3d4d304b-5bae-475d-9d99-da422d354bb0","Type":"ContainerDied","Data":"66e6ec17db8d833bd40c8771a9adf002e3a61d956f5866fe6c84940078c6848f"}
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.505248 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66e6ec17db8d833bd40c8771a9adf002e3a61d956f5866fe6c84940078c6848f"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.602367 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"]
Mar 10 14:39:05 crc kubenswrapper[4911]: E0310 14:39:05.602874 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerName="extract-utilities"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.602898 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerName="extract-utilities"
Mar 10 14:39:05 crc kubenswrapper[4911]: E0310 14:39:05.602916 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerName="registry-server"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.602923 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerName="registry-server"
Mar 10 14:39:05 crc kubenswrapper[4911]: E0310 14:39:05.602937 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerName="extract-content"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.602946 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerName="extract-content"
Mar 10 14:39:05 crc kubenswrapper[4911]: E0310 14:39:05.602963 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4d304b-5bae-475d-9d99-da422d354bb0" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.602969 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4d304b-5bae-475d-9d99-da422d354bb0" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.603161 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4d304b-5bae-475d-9d99-da422d354bb0" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.603205 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe372cc-0913-46d0-a2a3-e1f628459e04" containerName="registry-server"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.604134 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.608344 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.608831 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.609261 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.609743 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.610194 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.610993 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.634832 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.634956 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdng9\" (UniqueName: \"kubernetes.io/projected/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-kube-api-access-gdng9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.634986 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.635130 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.635163 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.635312 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.635557 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"]
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.736407 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.736522 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.736651 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.736682 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.736706 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdng9\" (UniqueName: \"kubernetes.io/projected/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-kube-api-access-gdng9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.736758 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.741810 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.742492 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.743385 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.744062 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.744387 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.755697 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdng9\" (UniqueName: \"kubernetes.io/projected/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-kube-api-access-gdng9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:05 crc kubenswrapper[4911]: I0310 14:39:05.935836 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"
Mar 10 14:39:06 crc kubenswrapper[4911]: I0310 14:39:06.491189 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf"]
Mar 10 14:39:06 crc kubenswrapper[4911]: I0310 14:39:06.516078 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf" event={"ID":"9fffeac8-b15e-48c2-a04e-7f6b6b28e142","Type":"ContainerStarted","Data":"0384e419f435fc7b632ccb0633da4f0d224572538890e275c28280f7a03a22c7"}
Mar 10 14:39:07 crc kubenswrapper[4911]: I0310 14:39:07.528995 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf" event={"ID":"9fffeac8-b15e-48c2-a04e-7f6b6b28e142","Type":"ContainerStarted","Data":"a14057558361370be009606660db8bb4bb23b325485ff8719456db76697567a7"}
Mar 10 14:39:07 crc kubenswrapper[4911]: I0310 14:39:07.557706 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf" podStartSLOduration=2.355930575 podStartE2EDuration="2.557682977s" podCreationTimestamp="2026-03-10 14:39:05 +0000 UTC" firstStartedPulling="2026-03-10 14:39:06.494950723 +0000 UTC m=+2251.058470640" lastFinishedPulling="2026-03-10 14:39:06.696703125 +0000 UTC m=+2251.260223042" observedRunningTime="2026-03-10 14:39:07.545000854 +0000 UTC m=+2252.108520771" watchObservedRunningTime="2026-03-10 14:39:07.557682977 +0000 UTC m=+2252.121202894"
Mar 10 14:39:18 crc kubenswrapper[4911]: I0310 14:39:18.520666 4911 patch_prober.go:28] interesting
pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:39:18 crc kubenswrapper[4911]: I0310 14:39:18.521375 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:39:48 crc kubenswrapper[4911]: I0310 14:39:48.520640 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:39:48 crc kubenswrapper[4911]: I0310 14:39:48.521276 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:39:48 crc kubenswrapper[4911]: I0310 14:39:48.521337 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:39:48 crc kubenswrapper[4911]: I0310 14:39:48.522502 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Mar 10 14:39:48 crc kubenswrapper[4911]: I0310 14:39:48.522574 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" gracePeriod=600 Mar 10 14:39:48 crc kubenswrapper[4911]: E0310 14:39:48.649099 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:39:48 crc kubenswrapper[4911]: I0310 14:39:48.955112 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" exitCode=0 Mar 10 14:39:48 crc kubenswrapper[4911]: I0310 14:39:48.955223 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233"} Mar 10 14:39:48 crc kubenswrapper[4911]: I0310 14:39:48.955302 4911 scope.go:117] "RemoveContainer" containerID="4cb4b818a888d5417b0612f19c0270a3d874b51a5ee2051ea9d2db487bebc236" Mar 10 14:39:48 crc kubenswrapper[4911]: I0310 14:39:48.958154 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:39:48 crc kubenswrapper[4911]: E0310 14:39:48.959481 4911 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:39:50 crc kubenswrapper[4911]: I0310 14:39:50.981073 4911 generic.go:334] "Generic (PLEG): container finished" podID="9fffeac8-b15e-48c2-a04e-7f6b6b28e142" containerID="a14057558361370be009606660db8bb4bb23b325485ff8719456db76697567a7" exitCode=0 Mar 10 14:39:50 crc kubenswrapper[4911]: I0310 14:39:50.981130 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf" event={"ID":"9fffeac8-b15e-48c2-a04e-7f6b6b28e142","Type":"ContainerDied","Data":"a14057558361370be009606660db8bb4bb23b325485ff8719456db76697567a7"} Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.464083 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.613975 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-inventory\") pod \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.615036 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdng9\" (UniqueName: \"kubernetes.io/projected/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-kube-api-access-gdng9\") pod \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.615259 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-metadata-combined-ca-bundle\") pod \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.615518 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-nova-metadata-neutron-config-0\") pod \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.615924 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " Mar 
10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.616158 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-ssh-key-openstack-edpm-ipam\") pod \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\" (UID: \"9fffeac8-b15e-48c2-a04e-7f6b6b28e142\") " Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.621453 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9fffeac8-b15e-48c2-a04e-7f6b6b28e142" (UID: "9fffeac8-b15e-48c2-a04e-7f6b6b28e142"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.621482 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-kube-api-access-gdng9" (OuterVolumeSpecName: "kube-api-access-gdng9") pod "9fffeac8-b15e-48c2-a04e-7f6b6b28e142" (UID: "9fffeac8-b15e-48c2-a04e-7f6b6b28e142"). InnerVolumeSpecName "kube-api-access-gdng9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.645082 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-inventory" (OuterVolumeSpecName: "inventory") pod "9fffeac8-b15e-48c2-a04e-7f6b6b28e142" (UID: "9fffeac8-b15e-48c2-a04e-7f6b6b28e142"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.645093 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9fffeac8-b15e-48c2-a04e-7f6b6b28e142" (UID: "9fffeac8-b15e-48c2-a04e-7f6b6b28e142"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.645594 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9fffeac8-b15e-48c2-a04e-7f6b6b28e142" (UID: "9fffeac8-b15e-48c2-a04e-7f6b6b28e142"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.651888 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9fffeac8-b15e-48c2-a04e-7f6b6b28e142" (UID: "9fffeac8-b15e-48c2-a04e-7f6b6b28e142"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.720493 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.720862 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdng9\" (UniqueName: \"kubernetes.io/projected/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-kube-api-access-gdng9\") on node \"crc\" DevicePath \"\"" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.720886 4911 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.720962 4911 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.720982 4911 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:39:52 crc kubenswrapper[4911]: I0310 14:39:52.720997 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fffeac8-b15e-48c2-a04e-7f6b6b28e142-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.003846 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf" event={"ID":"9fffeac8-b15e-48c2-a04e-7f6b6b28e142","Type":"ContainerDied","Data":"0384e419f435fc7b632ccb0633da4f0d224572538890e275c28280f7a03a22c7"} Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.003918 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0384e419f435fc7b632ccb0633da4f0d224572538890e275c28280f7a03a22c7" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.003946 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.118158 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75"] Mar 10 14:39:53 crc kubenswrapper[4911]: E0310 14:39:53.120820 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fffeac8-b15e-48c2-a04e-7f6b6b28e142" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.120855 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fffeac8-b15e-48c2-a04e-7f6b6b28e142" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.121250 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fffeac8-b15e-48c2-a04e-7f6b6b28e142" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.122357 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.125245 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.125714 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.126201 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.126562 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.126837 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.131022 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75"] Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.232245 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzf6j\" (UniqueName: \"kubernetes.io/projected/2d1eaf3f-414a-426a-8dbf-15825613d50a-kube-api-access-nzf6j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.232332 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: 
\"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.232364 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.232426 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.232504 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.334394 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzf6j\" (UniqueName: \"kubernetes.io/projected/2d1eaf3f-414a-426a-8dbf-15825613d50a-kube-api-access-nzf6j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.334462 
4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.334497 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.334609 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.334753 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.339330 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-ssh-key-openstack-edpm-ipam\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.339984 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.341092 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.342392 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.357694 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzf6j\" (UniqueName: \"kubernetes.io/projected/2d1eaf3f-414a-426a-8dbf-15825613d50a-kube-api-access-nzf6j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mrq75\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:53 crc kubenswrapper[4911]: I0310 14:39:53.452827 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:39:54 crc kubenswrapper[4911]: I0310 14:39:54.007998 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75"] Mar 10 14:39:55 crc kubenswrapper[4911]: I0310 14:39:55.029287 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" event={"ID":"2d1eaf3f-414a-426a-8dbf-15825613d50a","Type":"ContainerStarted","Data":"9c9471c09a223d864bf954808f34aae8ed41c9e96227b7a9c1f0c45bddf8523b"} Mar 10 14:39:55 crc kubenswrapper[4911]: I0310 14:39:55.029916 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" event={"ID":"2d1eaf3f-414a-426a-8dbf-15825613d50a","Type":"ContainerStarted","Data":"140dd2a876d990ab00b9b2d835d2f9360ca5c108c207acf9f96c3c5929ab9da8"} Mar 10 14:39:55 crc kubenswrapper[4911]: I0310 14:39:55.051075 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" podStartSLOduration=1.875335835 podStartE2EDuration="2.051044122s" podCreationTimestamp="2026-03-10 14:39:53 +0000 UTC" firstStartedPulling="2026-03-10 14:39:54.015541124 +0000 UTC m=+2298.579061041" lastFinishedPulling="2026-03-10 14:39:54.191249411 +0000 UTC m=+2298.754769328" observedRunningTime="2026-03-10 14:39:55.04692147 +0000 UTC m=+2299.610441387" watchObservedRunningTime="2026-03-10 14:39:55.051044122 +0000 UTC m=+2299.614564039" Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.167590 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552560-9k5z5"] Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.174181 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552560-9k5z5" Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.178413 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552560-9k5z5"] Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.179779 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.180034 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.180160 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.190798 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75lk5\" (UniqueName: \"kubernetes.io/projected/baff6eff-7b06-496a-843b-995a53744848-kube-api-access-75lk5\") pod \"auto-csr-approver-29552560-9k5z5\" (UID: \"baff6eff-7b06-496a-843b-995a53744848\") " pod="openshift-infra/auto-csr-approver-29552560-9k5z5" Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.293010 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75lk5\" (UniqueName: \"kubernetes.io/projected/baff6eff-7b06-496a-843b-995a53744848-kube-api-access-75lk5\") pod \"auto-csr-approver-29552560-9k5z5\" (UID: \"baff6eff-7b06-496a-843b-995a53744848\") " pod="openshift-infra/auto-csr-approver-29552560-9k5z5" Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.318764 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75lk5\" (UniqueName: \"kubernetes.io/projected/baff6eff-7b06-496a-843b-995a53744848-kube-api-access-75lk5\") pod \"auto-csr-approver-29552560-9k5z5\" (UID: \"baff6eff-7b06-496a-843b-995a53744848\") " 
pod="openshift-infra/auto-csr-approver-29552560-9k5z5" Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.505514 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552560-9k5z5" Mar 10 14:40:00 crc kubenswrapper[4911]: I0310 14:40:00.966715 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552560-9k5z5"] Mar 10 14:40:01 crc kubenswrapper[4911]: I0310 14:40:01.088962 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552560-9k5z5" event={"ID":"baff6eff-7b06-496a-843b-995a53744848","Type":"ContainerStarted","Data":"f1e3c7631476bc028adac9761081855e6dce8b82d751ad327ea4b2e0a369c834"} Mar 10 14:40:02 crc kubenswrapper[4911]: I0310 14:40:02.194628 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:40:02 crc kubenswrapper[4911]: E0310 14:40:02.195310 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:40:03 crc kubenswrapper[4911]: I0310 14:40:03.111256 4911 generic.go:334] "Generic (PLEG): container finished" podID="baff6eff-7b06-496a-843b-995a53744848" containerID="63debf0da99332146fa4eb94fb27bd691fd0cd020c5f4402a63139473195c69f" exitCode=0 Mar 10 14:40:03 crc kubenswrapper[4911]: I0310 14:40:03.111374 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552560-9k5z5" event={"ID":"baff6eff-7b06-496a-843b-995a53744848","Type":"ContainerDied","Data":"63debf0da99332146fa4eb94fb27bd691fd0cd020c5f4402a63139473195c69f"} 
Mar 10 14:40:04 crc kubenswrapper[4911]: I0310 14:40:04.539220 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552560-9k5z5" Mar 10 14:40:04 crc kubenswrapper[4911]: I0310 14:40:04.712284 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75lk5\" (UniqueName: \"kubernetes.io/projected/baff6eff-7b06-496a-843b-995a53744848-kube-api-access-75lk5\") pod \"baff6eff-7b06-496a-843b-995a53744848\" (UID: \"baff6eff-7b06-496a-843b-995a53744848\") " Mar 10 14:40:04 crc kubenswrapper[4911]: I0310 14:40:04.730406 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baff6eff-7b06-496a-843b-995a53744848-kube-api-access-75lk5" (OuterVolumeSpecName: "kube-api-access-75lk5") pod "baff6eff-7b06-496a-843b-995a53744848" (UID: "baff6eff-7b06-496a-843b-995a53744848"). InnerVolumeSpecName "kube-api-access-75lk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:40:04 crc kubenswrapper[4911]: I0310 14:40:04.816479 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75lk5\" (UniqueName: \"kubernetes.io/projected/baff6eff-7b06-496a-843b-995a53744848-kube-api-access-75lk5\") on node \"crc\" DevicePath \"\"" Mar 10 14:40:05 crc kubenswrapper[4911]: I0310 14:40:05.150129 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552560-9k5z5" event={"ID":"baff6eff-7b06-496a-843b-995a53744848","Type":"ContainerDied","Data":"f1e3c7631476bc028adac9761081855e6dce8b82d751ad327ea4b2e0a369c834"} Mar 10 14:40:05 crc kubenswrapper[4911]: I0310 14:40:05.150568 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1e3c7631476bc028adac9761081855e6dce8b82d751ad327ea4b2e0a369c834" Mar 10 14:40:05 crc kubenswrapper[4911]: I0310 14:40:05.150230 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552560-9k5z5" Mar 10 14:40:05 crc kubenswrapper[4911]: I0310 14:40:05.626385 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552554-t65c4"] Mar 10 14:40:05 crc kubenswrapper[4911]: I0310 14:40:05.635116 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552554-t65c4"] Mar 10 14:40:06 crc kubenswrapper[4911]: I0310 14:40:06.212651 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e91a7cec-4b2e-4ec7-bb08-f5d6043394cc" path="/var/lib/kubelet/pods/e91a7cec-4b2e-4ec7-bb08-f5d6043394cc/volumes" Mar 10 14:40:13 crc kubenswrapper[4911]: I0310 14:40:13.194499 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:40:13 crc kubenswrapper[4911]: E0310 14:40:13.195193 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:40:26 crc kubenswrapper[4911]: I0310 14:40:26.203891 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:40:26 crc kubenswrapper[4911]: E0310 14:40:26.205016 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" 
podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:40:39 crc kubenswrapper[4911]: I0310 14:40:39.194317 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:40:39 crc kubenswrapper[4911]: E0310 14:40:39.195177 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.494386 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p4wdp"] Mar 10 14:40:49 crc kubenswrapper[4911]: E0310 14:40:49.495626 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baff6eff-7b06-496a-843b-995a53744848" containerName="oc" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.495647 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="baff6eff-7b06-496a-843b-995a53744848" containerName="oc" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.495928 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="baff6eff-7b06-496a-843b-995a53744848" containerName="oc" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.502266 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.523692 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4wdp"] Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.602432 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrmz5\" (UniqueName: \"kubernetes.io/projected/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-kube-api-access-jrmz5\") pod \"certified-operators-p4wdp\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.602511 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-utilities\") pod \"certified-operators-p4wdp\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.602591 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-catalog-content\") pod \"certified-operators-p4wdp\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.704941 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrmz5\" (UniqueName: \"kubernetes.io/projected/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-kube-api-access-jrmz5\") pod \"certified-operators-p4wdp\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.705058 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-utilities\") pod \"certified-operators-p4wdp\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.705214 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-catalog-content\") pod \"certified-operators-p4wdp\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.705672 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-utilities\") pod \"certified-operators-p4wdp\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.705759 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-catalog-content\") pod \"certified-operators-p4wdp\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.729022 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrmz5\" (UniqueName: \"kubernetes.io/projected/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-kube-api-access-jrmz5\") pod \"certified-operators-p4wdp\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:49 crc kubenswrapper[4911]: I0310 14:40:49.836965 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:50 crc kubenswrapper[4911]: I0310 14:40:50.395628 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4wdp"] Mar 10 14:40:50 crc kubenswrapper[4911]: I0310 14:40:50.609462 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4wdp" event={"ID":"1e598801-24bf-42b2-bdf9-0a2933c7a2a5","Type":"ContainerStarted","Data":"cf523a4c571a77dff45781fabd036fdefd30496f97a155b400140ff726367079"} Mar 10 14:40:51 crc kubenswrapper[4911]: I0310 14:40:51.621036 4911 generic.go:334] "Generic (PLEG): container finished" podID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerID="31571360feff5618be2e17cd09d30a0aa4c6a2438b9a0f3e30a4164e554768d0" exitCode=0 Mar 10 14:40:51 crc kubenswrapper[4911]: I0310 14:40:51.621099 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4wdp" event={"ID":"1e598801-24bf-42b2-bdf9-0a2933c7a2a5","Type":"ContainerDied","Data":"31571360feff5618be2e17cd09d30a0aa4c6a2438b9a0f3e30a4164e554768d0"} Mar 10 14:40:52 crc kubenswrapper[4911]: I0310 14:40:52.633399 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4wdp" event={"ID":"1e598801-24bf-42b2-bdf9-0a2933c7a2a5","Type":"ContainerStarted","Data":"07e5192a487303d893800527b670d88be39fc9185bf2d0c5283021fd4d6772af"} Mar 10 14:40:53 crc kubenswrapper[4911]: I0310 14:40:53.194238 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:40:53 crc kubenswrapper[4911]: E0310 14:40:53.195012 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:40:53 crc kubenswrapper[4911]: I0310 14:40:53.644846 4911 generic.go:334] "Generic (PLEG): container finished" podID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerID="07e5192a487303d893800527b670d88be39fc9185bf2d0c5283021fd4d6772af" exitCode=0 Mar 10 14:40:53 crc kubenswrapper[4911]: I0310 14:40:53.644893 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4wdp" event={"ID":"1e598801-24bf-42b2-bdf9-0a2933c7a2a5","Type":"ContainerDied","Data":"07e5192a487303d893800527b670d88be39fc9185bf2d0c5283021fd4d6772af"} Mar 10 14:40:54 crc kubenswrapper[4911]: I0310 14:40:54.658150 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4wdp" event={"ID":"1e598801-24bf-42b2-bdf9-0a2933c7a2a5","Type":"ContainerStarted","Data":"28183eeb0abe3123897d6e9a36ab8e6f288cc23ebc498379a315b1c055d99009"} Mar 10 14:40:54 crc kubenswrapper[4911]: I0310 14:40:54.689454 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p4wdp" podStartSLOduration=3.219343008 podStartE2EDuration="5.689420472s" podCreationTimestamp="2026-03-10 14:40:49 +0000 UTC" firstStartedPulling="2026-03-10 14:40:51.623536999 +0000 UTC m=+2356.187056916" lastFinishedPulling="2026-03-10 14:40:54.093614463 +0000 UTC m=+2358.657134380" observedRunningTime="2026-03-10 14:40:54.680137451 +0000 UTC m=+2359.243657358" watchObservedRunningTime="2026-03-10 14:40:54.689420472 +0000 UTC m=+2359.252940379" Mar 10 14:40:56 crc kubenswrapper[4911]: I0310 14:40:56.104230 4911 scope.go:117] "RemoveContainer" containerID="ef827851c33cd04d9615d1757eac7601489214572902f15b804fa36a54086471" Mar 10 14:40:59 crc kubenswrapper[4911]: I0310 14:40:59.837589 
4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:59 crc kubenswrapper[4911]: I0310 14:40:59.838009 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:40:59 crc kubenswrapper[4911]: I0310 14:40:59.886627 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:41:00 crc kubenswrapper[4911]: I0310 14:41:00.760701 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:41:00 crc kubenswrapper[4911]: I0310 14:41:00.813284 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4wdp"] Mar 10 14:41:02 crc kubenswrapper[4911]: I0310 14:41:02.747128 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p4wdp" podUID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerName="registry-server" containerID="cri-o://28183eeb0abe3123897d6e9a36ab8e6f288cc23ebc498379a315b1c055d99009" gracePeriod=2 Mar 10 14:41:03 crc kubenswrapper[4911]: I0310 14:41:03.764916 4911 generic.go:334] "Generic (PLEG): container finished" podID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerID="28183eeb0abe3123897d6e9a36ab8e6f288cc23ebc498379a315b1c055d99009" exitCode=0 Mar 10 14:41:03 crc kubenswrapper[4911]: I0310 14:41:03.764981 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4wdp" event={"ID":"1e598801-24bf-42b2-bdf9-0a2933c7a2a5","Type":"ContainerDied","Data":"28183eeb0abe3123897d6e9a36ab8e6f288cc23ebc498379a315b1c055d99009"} Mar 10 14:41:03 crc kubenswrapper[4911]: I0310 14:41:03.765368 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4wdp" 
event={"ID":"1e598801-24bf-42b2-bdf9-0a2933c7a2a5","Type":"ContainerDied","Data":"cf523a4c571a77dff45781fabd036fdefd30496f97a155b400140ff726367079"} Mar 10 14:41:03 crc kubenswrapper[4911]: I0310 14:41:03.765396 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf523a4c571a77dff45781fabd036fdefd30496f97a155b400140ff726367079" Mar 10 14:41:03 crc kubenswrapper[4911]: I0310 14:41:03.818633 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:41:03 crc kubenswrapper[4911]: I0310 14:41:03.934524 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-utilities\") pod \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " Mar 10 14:41:03 crc kubenswrapper[4911]: I0310 14:41:03.934622 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-catalog-content\") pod \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " Mar 10 14:41:03 crc kubenswrapper[4911]: I0310 14:41:03.934657 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrmz5\" (UniqueName: \"kubernetes.io/projected/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-kube-api-access-jrmz5\") pod \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\" (UID: \"1e598801-24bf-42b2-bdf9-0a2933c7a2a5\") " Mar 10 14:41:03 crc kubenswrapper[4911]: I0310 14:41:03.935892 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-utilities" (OuterVolumeSpecName: "utilities") pod "1e598801-24bf-42b2-bdf9-0a2933c7a2a5" (UID: "1e598801-24bf-42b2-bdf9-0a2933c7a2a5"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:41:03 crc kubenswrapper[4911]: I0310 14:41:03.944309 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-kube-api-access-jrmz5" (OuterVolumeSpecName: "kube-api-access-jrmz5") pod "1e598801-24bf-42b2-bdf9-0a2933c7a2a5" (UID: "1e598801-24bf-42b2-bdf9-0a2933c7a2a5"). InnerVolumeSpecName "kube-api-access-jrmz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:41:04 crc kubenswrapper[4911]: I0310 14:41:04.037560 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:41:04 crc kubenswrapper[4911]: I0310 14:41:04.037608 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrmz5\" (UniqueName: \"kubernetes.io/projected/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-kube-api-access-jrmz5\") on node \"crc\" DevicePath \"\"" Mar 10 14:41:04 crc kubenswrapper[4911]: I0310 14:41:04.346484 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e598801-24bf-42b2-bdf9-0a2933c7a2a5" (UID: "1e598801-24bf-42b2-bdf9-0a2933c7a2a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:41:04 crc kubenswrapper[4911]: I0310 14:41:04.445391 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e598801-24bf-42b2-bdf9-0a2933c7a2a5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:41:04 crc kubenswrapper[4911]: I0310 14:41:04.774546 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p4wdp" Mar 10 14:41:04 crc kubenswrapper[4911]: I0310 14:41:04.827077 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4wdp"] Mar 10 14:41:04 crc kubenswrapper[4911]: I0310 14:41:04.836014 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p4wdp"] Mar 10 14:41:06 crc kubenswrapper[4911]: I0310 14:41:06.199985 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:41:06 crc kubenswrapper[4911]: E0310 14:41:06.200397 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:41:06 crc kubenswrapper[4911]: I0310 14:41:06.208375 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" path="/var/lib/kubelet/pods/1e598801-24bf-42b2-bdf9-0a2933c7a2a5/volumes" Mar 10 14:41:18 crc kubenswrapper[4911]: I0310 14:41:18.193821 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:41:18 crc kubenswrapper[4911]: E0310 14:41:18.194823 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" 
podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:41:31 crc kubenswrapper[4911]: I0310 14:41:31.194005 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:41:31 crc kubenswrapper[4911]: E0310 14:41:31.194962 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:41:44 crc kubenswrapper[4911]: I0310 14:41:44.193429 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:41:44 crc kubenswrapper[4911]: E0310 14:41:44.194378 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:41:57 crc kubenswrapper[4911]: I0310 14:41:57.193901 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:41:57 crc kubenswrapper[4911]: E0310 14:41:57.194752 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.150549 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552562-xsfhx"] Mar 10 14:42:00 crc kubenswrapper[4911]: E0310 14:42:00.151215 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerName="registry-server" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.151233 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerName="registry-server" Mar 10 14:42:00 crc kubenswrapper[4911]: E0310 14:42:00.151260 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerName="extract-content" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.151266 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerName="extract-content" Mar 10 14:42:00 crc kubenswrapper[4911]: E0310 14:42:00.151290 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerName="extract-utilities" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.151303 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerName="extract-utilities" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.151516 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e598801-24bf-42b2-bdf9-0a2933c7a2a5" containerName="registry-server" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.152545 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552562-xsfhx" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.158531 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.158621 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.159220 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.161893 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552562-xsfhx"] Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.175362 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hkxq\" (UniqueName: \"kubernetes.io/projected/15ed8fcd-f05b-488a-ac3e-4becbe638683-kube-api-access-5hkxq\") pod \"auto-csr-approver-29552562-xsfhx\" (UID: \"15ed8fcd-f05b-488a-ac3e-4becbe638683\") " pod="openshift-infra/auto-csr-approver-29552562-xsfhx" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.277781 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hkxq\" (UniqueName: \"kubernetes.io/projected/15ed8fcd-f05b-488a-ac3e-4becbe638683-kube-api-access-5hkxq\") pod \"auto-csr-approver-29552562-xsfhx\" (UID: \"15ed8fcd-f05b-488a-ac3e-4becbe638683\") " pod="openshift-infra/auto-csr-approver-29552562-xsfhx" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.303008 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hkxq\" (UniqueName: \"kubernetes.io/projected/15ed8fcd-f05b-488a-ac3e-4becbe638683-kube-api-access-5hkxq\") pod \"auto-csr-approver-29552562-xsfhx\" (UID: \"15ed8fcd-f05b-488a-ac3e-4becbe638683\") " 
pod="openshift-infra/auto-csr-approver-29552562-xsfhx" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.477139 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552562-xsfhx" Mar 10 14:42:00 crc kubenswrapper[4911]: I0310 14:42:00.937641 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552562-xsfhx"] Mar 10 14:42:01 crc kubenswrapper[4911]: I0310 14:42:01.345347 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552562-xsfhx" event={"ID":"15ed8fcd-f05b-488a-ac3e-4becbe638683","Type":"ContainerStarted","Data":"5f3bd9436d2f48f39cc63cc5b1e8d6332d4ad61eec6d966ebaab2c980c8c76d6"} Mar 10 14:42:03 crc kubenswrapper[4911]: I0310 14:42:03.370519 4911 generic.go:334] "Generic (PLEG): container finished" podID="15ed8fcd-f05b-488a-ac3e-4becbe638683" containerID="7d2388e7593019c70700d0c2a360b0225d0387c52f7b1c6a0802f3f23fb1fe5c" exitCode=0 Mar 10 14:42:03 crc kubenswrapper[4911]: I0310 14:42:03.371621 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552562-xsfhx" event={"ID":"15ed8fcd-f05b-488a-ac3e-4becbe638683","Type":"ContainerDied","Data":"7d2388e7593019c70700d0c2a360b0225d0387c52f7b1c6a0802f3f23fb1fe5c"} Mar 10 14:42:04 crc kubenswrapper[4911]: I0310 14:42:04.791761 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552562-xsfhx" Mar 10 14:42:04 crc kubenswrapper[4911]: I0310 14:42:04.981839 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hkxq\" (UniqueName: \"kubernetes.io/projected/15ed8fcd-f05b-488a-ac3e-4becbe638683-kube-api-access-5hkxq\") pod \"15ed8fcd-f05b-488a-ac3e-4becbe638683\" (UID: \"15ed8fcd-f05b-488a-ac3e-4becbe638683\") " Mar 10 14:42:04 crc kubenswrapper[4911]: I0310 14:42:04.994330 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ed8fcd-f05b-488a-ac3e-4becbe638683-kube-api-access-5hkxq" (OuterVolumeSpecName: "kube-api-access-5hkxq") pod "15ed8fcd-f05b-488a-ac3e-4becbe638683" (UID: "15ed8fcd-f05b-488a-ac3e-4becbe638683"). InnerVolumeSpecName "kube-api-access-5hkxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:42:05 crc kubenswrapper[4911]: I0310 14:42:05.084854 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hkxq\" (UniqueName: \"kubernetes.io/projected/15ed8fcd-f05b-488a-ac3e-4becbe638683-kube-api-access-5hkxq\") on node \"crc\" DevicePath \"\"" Mar 10 14:42:05 crc kubenswrapper[4911]: I0310 14:42:05.391156 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552562-xsfhx" event={"ID":"15ed8fcd-f05b-488a-ac3e-4becbe638683","Type":"ContainerDied","Data":"5f3bd9436d2f48f39cc63cc5b1e8d6332d4ad61eec6d966ebaab2c980c8c76d6"} Mar 10 14:42:05 crc kubenswrapper[4911]: I0310 14:42:05.391202 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f3bd9436d2f48f39cc63cc5b1e8d6332d4ad61eec6d966ebaab2c980c8c76d6" Mar 10 14:42:05 crc kubenswrapper[4911]: I0310 14:42:05.391582 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552562-xsfhx" Mar 10 14:42:05 crc kubenswrapper[4911]: I0310 14:42:05.870766 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552556-sltw5"] Mar 10 14:42:05 crc kubenswrapper[4911]: I0310 14:42:05.879405 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552556-sltw5"] Mar 10 14:42:06 crc kubenswrapper[4911]: I0310 14:42:06.209865 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9dba33-a3f9-4a30-9c5a-8e9871a09592" path="/var/lib/kubelet/pods/0c9dba33-a3f9-4a30-9c5a-8e9871a09592/volumes" Mar 10 14:42:12 crc kubenswrapper[4911]: I0310 14:42:12.193947 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:42:12 crc kubenswrapper[4911]: E0310 14:42:12.194922 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:42:23 crc kubenswrapper[4911]: I0310 14:42:23.194312 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:42:23 crc kubenswrapper[4911]: E0310 14:42:23.195353 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" 
podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:42:37 crc kubenswrapper[4911]: I0310 14:42:37.194336 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:42:37 crc kubenswrapper[4911]: E0310 14:42:37.195196 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:42:48 crc kubenswrapper[4911]: I0310 14:42:48.194026 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:42:48 crc kubenswrapper[4911]: E0310 14:42:48.195360 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:42:56 crc kubenswrapper[4911]: I0310 14:42:56.223568 4911 scope.go:117] "RemoveContainer" containerID="894d55192e067d421967c5bec84669bcbfa6e647b27d3ec2a50d8f86b98f1dfe" Mar 10 14:43:03 crc kubenswrapper[4911]: I0310 14:43:03.193380 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:43:03 crc kubenswrapper[4911]: E0310 14:43:03.194331 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:43:14 crc kubenswrapper[4911]: I0310 14:43:14.194331 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:43:14 crc kubenswrapper[4911]: E0310 14:43:14.195412 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:43:25 crc kubenswrapper[4911]: I0310 14:43:25.194021 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:43:25 crc kubenswrapper[4911]: E0310 14:43:25.195210 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:43:37 crc kubenswrapper[4911]: I0310 14:43:37.194342 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:43:37 crc kubenswrapper[4911]: E0310 14:43:37.195144 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:43:44 crc kubenswrapper[4911]: I0310 14:43:44.443483 4911 generic.go:334] "Generic (PLEG): container finished" podID="2d1eaf3f-414a-426a-8dbf-15825613d50a" containerID="9c9471c09a223d864bf954808f34aae8ed41c9e96227b7a9c1f0c45bddf8523b" exitCode=0 Mar 10 14:43:44 crc kubenswrapper[4911]: I0310 14:43:44.443589 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" event={"ID":"2d1eaf3f-414a-426a-8dbf-15825613d50a","Type":"ContainerDied","Data":"9c9471c09a223d864bf954808f34aae8ed41c9e96227b7a9c1f0c45bddf8523b"} Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.842424 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.935356 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-inventory\") pod \"2d1eaf3f-414a-426a-8dbf-15825613d50a\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.935420 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzf6j\" (UniqueName: \"kubernetes.io/projected/2d1eaf3f-414a-426a-8dbf-15825613d50a-kube-api-access-nzf6j\") pod \"2d1eaf3f-414a-426a-8dbf-15825613d50a\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.935483 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-ssh-key-openstack-edpm-ipam\") pod \"2d1eaf3f-414a-426a-8dbf-15825613d50a\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.935550 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-combined-ca-bundle\") pod \"2d1eaf3f-414a-426a-8dbf-15825613d50a\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.935604 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-secret-0\") pod \"2d1eaf3f-414a-426a-8dbf-15825613d50a\" (UID: \"2d1eaf3f-414a-426a-8dbf-15825613d50a\") " Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.941504 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1eaf3f-414a-426a-8dbf-15825613d50a-kube-api-access-nzf6j" (OuterVolumeSpecName: "kube-api-access-nzf6j") pod "2d1eaf3f-414a-426a-8dbf-15825613d50a" (UID: "2d1eaf3f-414a-426a-8dbf-15825613d50a"). InnerVolumeSpecName "kube-api-access-nzf6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.942594 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2d1eaf3f-414a-426a-8dbf-15825613d50a" (UID: "2d1eaf3f-414a-426a-8dbf-15825613d50a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.968359 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-inventory" (OuterVolumeSpecName: "inventory") pod "2d1eaf3f-414a-426a-8dbf-15825613d50a" (UID: "2d1eaf3f-414a-426a-8dbf-15825613d50a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.970172 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2d1eaf3f-414a-426a-8dbf-15825613d50a" (UID: "2d1eaf3f-414a-426a-8dbf-15825613d50a"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:43:45 crc kubenswrapper[4911]: I0310 14:43:45.971091 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2d1eaf3f-414a-426a-8dbf-15825613d50a" (UID: "2d1eaf3f-414a-426a-8dbf-15825613d50a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.037897 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.037930 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzf6j\" (UniqueName: \"kubernetes.io/projected/2d1eaf3f-414a-426a-8dbf-15825613d50a-kube-api-access-nzf6j\") on node \"crc\" DevicePath \"\"" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.037943 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.037952 4911 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.037962 4911 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d1eaf3f-414a-426a-8dbf-15825613d50a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.465451 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" event={"ID":"2d1eaf3f-414a-426a-8dbf-15825613d50a","Type":"ContainerDied","Data":"140dd2a876d990ab00b9b2d835d2f9360ca5c108c207acf9f96c3c5929ab9da8"} Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.465499 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mrq75" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.465508 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="140dd2a876d990ab00b9b2d835d2f9360ca5c108c207acf9f96c3c5929ab9da8" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.645850 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc"] Mar 10 14:43:46 crc kubenswrapper[4911]: E0310 14:43:46.646436 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ed8fcd-f05b-488a-ac3e-4becbe638683" containerName="oc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.646462 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ed8fcd-f05b-488a-ac3e-4becbe638683" containerName="oc" Mar 10 14:43:46 crc kubenswrapper[4911]: E0310 14:43:46.646476 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1eaf3f-414a-426a-8dbf-15825613d50a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.646485 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1eaf3f-414a-426a-8dbf-15825613d50a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.646752 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ed8fcd-f05b-488a-ac3e-4becbe638683" containerName="oc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.646786 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1eaf3f-414a-426a-8dbf-15825613d50a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.647660 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.652181 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.652419 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.652945 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.653350 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.653468 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.653578 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.654306 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.661498 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc"] Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.754657 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 
14:43:46.754767 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.754800 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.754827 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.754850 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h48gx\" (UniqueName: \"kubernetes.io/projected/43b7e07c-895e-46e1-9863-4dc4845a72ea-kube-api-access-h48gx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.754895 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.754943 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.754984 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.755057 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.755080 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-2\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.755102 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.857955 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.858055 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.858093 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: 
I0310 14:43:46.858120 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.858148 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h48gx\" (UniqueName: \"kubernetes.io/projected/43b7e07c-895e-46e1-9863-4dc4845a72ea-kube-api-access-h48gx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.858188 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.858221 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.858251 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.858307 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.858331 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.858349 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.859557 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.865444 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.865973 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.866333 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.866461 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.867798 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-2\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.867842 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.868429 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.868489 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.876473 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.885149 4911 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-h48gx\" (UniqueName: \"kubernetes.io/projected/43b7e07c-895e-46e1-9863-4dc4845a72ea-kube-api-access-h48gx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dksjc\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:46 crc kubenswrapper[4911]: I0310 14:43:46.965253 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:43:47 crc kubenswrapper[4911]: I0310 14:43:47.522101 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc"] Mar 10 14:43:47 crc kubenswrapper[4911]: I0310 14:43:47.524992 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 14:43:48 crc kubenswrapper[4911]: I0310 14:43:48.492074 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" event={"ID":"43b7e07c-895e-46e1-9863-4dc4845a72ea","Type":"ContainerStarted","Data":"d9b5d5630603aae2715d6ab2a7dbc7fdc46c0a7a5b47d7d416fabd5d932e771c"} Mar 10 14:43:48 crc kubenswrapper[4911]: I0310 14:43:48.492488 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" event={"ID":"43b7e07c-895e-46e1-9863-4dc4845a72ea","Type":"ContainerStarted","Data":"2ead0d7aca353b5f804366afa0218774c42ae2a51e3e4f367dfa0c794f5b2b74"} Mar 10 14:43:48 crc kubenswrapper[4911]: I0310 14:43:48.528244 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" podStartSLOduration=2.321744747 podStartE2EDuration="2.528209056s" podCreationTimestamp="2026-03-10 14:43:46 +0000 UTC" firstStartedPulling="2026-03-10 14:43:47.524789193 +0000 UTC m=+2532.088309100" lastFinishedPulling="2026-03-10 14:43:47.731253492 +0000 UTC 
m=+2532.294773409" observedRunningTime="2026-03-10 14:43:48.518679649 +0000 UTC m=+2533.082199576" watchObservedRunningTime="2026-03-10 14:43:48.528209056 +0000 UTC m=+2533.091728963" Mar 10 14:43:49 crc kubenswrapper[4911]: I0310 14:43:49.194583 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:43:49 crc kubenswrapper[4911]: E0310 14:43:49.195105 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.146645 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552564-hqwcw"] Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.148846 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552564-hqwcw" Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.151402 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.151947 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.152196 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.160103 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552564-hqwcw"] Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.264343 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srn55\" (UniqueName: \"kubernetes.io/projected/63bcf83b-001c-4b6a-9aa1-df8fbd2f5491-kube-api-access-srn55\") pod \"auto-csr-approver-29552564-hqwcw\" (UID: \"63bcf83b-001c-4b6a-9aa1-df8fbd2f5491\") " pod="openshift-infra/auto-csr-approver-29552564-hqwcw" Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.366671 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srn55\" (UniqueName: \"kubernetes.io/projected/63bcf83b-001c-4b6a-9aa1-df8fbd2f5491-kube-api-access-srn55\") pod \"auto-csr-approver-29552564-hqwcw\" (UID: \"63bcf83b-001c-4b6a-9aa1-df8fbd2f5491\") " pod="openshift-infra/auto-csr-approver-29552564-hqwcw" Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.387343 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srn55\" (UniqueName: \"kubernetes.io/projected/63bcf83b-001c-4b6a-9aa1-df8fbd2f5491-kube-api-access-srn55\") pod \"auto-csr-approver-29552564-hqwcw\" (UID: \"63bcf83b-001c-4b6a-9aa1-df8fbd2f5491\") " 
pod="openshift-infra/auto-csr-approver-29552564-hqwcw" Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.468797 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552564-hqwcw" Mar 10 14:44:00 crc kubenswrapper[4911]: I0310 14:44:00.934862 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552564-hqwcw"] Mar 10 14:44:01 crc kubenswrapper[4911]: I0310 14:44:01.628558 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552564-hqwcw" event={"ID":"63bcf83b-001c-4b6a-9aa1-df8fbd2f5491","Type":"ContainerStarted","Data":"f5abb5d2faf3f3ecf13ec7f5da1e7cff22de0133e092d543aa9c9b6fa4414c72"} Mar 10 14:44:02 crc kubenswrapper[4911]: I0310 14:44:02.638075 4911 generic.go:334] "Generic (PLEG): container finished" podID="63bcf83b-001c-4b6a-9aa1-df8fbd2f5491" containerID="a2f2f014bcfa233463f20f997ea0c30c6ee07019fe84aea19b9e64445fb18fe3" exitCode=0 Mar 10 14:44:02 crc kubenswrapper[4911]: I0310 14:44:02.638192 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552564-hqwcw" event={"ID":"63bcf83b-001c-4b6a-9aa1-df8fbd2f5491","Type":"ContainerDied","Data":"a2f2f014bcfa233463f20f997ea0c30c6ee07019fe84aea19b9e64445fb18fe3"} Mar 10 14:44:03 crc kubenswrapper[4911]: I0310 14:44:03.961013 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552564-hqwcw" Mar 10 14:44:04 crc kubenswrapper[4911]: I0310 14:44:04.048004 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srn55\" (UniqueName: \"kubernetes.io/projected/63bcf83b-001c-4b6a-9aa1-df8fbd2f5491-kube-api-access-srn55\") pod \"63bcf83b-001c-4b6a-9aa1-df8fbd2f5491\" (UID: \"63bcf83b-001c-4b6a-9aa1-df8fbd2f5491\") " Mar 10 14:44:04 crc kubenswrapper[4911]: I0310 14:44:04.054836 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63bcf83b-001c-4b6a-9aa1-df8fbd2f5491-kube-api-access-srn55" (OuterVolumeSpecName: "kube-api-access-srn55") pod "63bcf83b-001c-4b6a-9aa1-df8fbd2f5491" (UID: "63bcf83b-001c-4b6a-9aa1-df8fbd2f5491"). InnerVolumeSpecName "kube-api-access-srn55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:44:04 crc kubenswrapper[4911]: I0310 14:44:04.151084 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srn55\" (UniqueName: \"kubernetes.io/projected/63bcf83b-001c-4b6a-9aa1-df8fbd2f5491-kube-api-access-srn55\") on node \"crc\" DevicePath \"\"" Mar 10 14:44:04 crc kubenswrapper[4911]: I0310 14:44:04.194171 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:44:04 crc kubenswrapper[4911]: E0310 14:44:04.194558 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:44:04 crc kubenswrapper[4911]: I0310 14:44:04.661045 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552564-hqwcw" event={"ID":"63bcf83b-001c-4b6a-9aa1-df8fbd2f5491","Type":"ContainerDied","Data":"f5abb5d2faf3f3ecf13ec7f5da1e7cff22de0133e092d543aa9c9b6fa4414c72"} Mar 10 14:44:04 crc kubenswrapper[4911]: I0310 14:44:04.661122 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5abb5d2faf3f3ecf13ec7f5da1e7cff22de0133e092d543aa9c9b6fa4414c72" Mar 10 14:44:04 crc kubenswrapper[4911]: I0310 14:44:04.661237 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552564-hqwcw" Mar 10 14:44:05 crc kubenswrapper[4911]: I0310 14:44:05.049427 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552558-n9c5z"] Mar 10 14:44:05 crc kubenswrapper[4911]: I0310 14:44:05.057159 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552558-n9c5z"] Mar 10 14:44:06 crc kubenswrapper[4911]: I0310 14:44:06.217361 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f99285f-2c0b-4a1a-920b-1daa90b49b19" path="/var/lib/kubelet/pods/3f99285f-2c0b-4a1a-920b-1daa90b49b19/volumes" Mar 10 14:44:19 crc kubenswrapper[4911]: I0310 14:44:19.193621 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:44:19 crc kubenswrapper[4911]: E0310 14:44:19.195218 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:44:31 crc kubenswrapper[4911]: I0310 14:44:31.193602 4911 scope.go:117] "RemoveContainer" 
containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:44:31 crc kubenswrapper[4911]: E0310 14:44:31.194782 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:44:46 crc kubenswrapper[4911]: I0310 14:44:46.200113 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:44:46 crc kubenswrapper[4911]: E0310 14:44:46.200818 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:44:56 crc kubenswrapper[4911]: I0310 14:44:56.323525 4911 scope.go:117] "RemoveContainer" containerID="7f21fef8a6e95af340b78a09afc4026e5c47b882f3368268a824f9a355c4ff35" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.154285 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv"] Mar 10 14:45:00 crc kubenswrapper[4911]: E0310 14:45:00.155292 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bcf83b-001c-4b6a-9aa1-df8fbd2f5491" containerName="oc" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.155307 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bcf83b-001c-4b6a-9aa1-df8fbd2f5491" containerName="oc" Mar 10 
14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.155494 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="63bcf83b-001c-4b6a-9aa1-df8fbd2f5491" containerName="oc" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.156509 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.158634 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.159221 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.163812 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv"] Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.195419 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.329980 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-config-volume\") pod \"collect-profiles-29552565-zcdmv\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.330276 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-secret-volume\") pod \"collect-profiles-29552565-zcdmv\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.330303 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnt48\" (UniqueName: \"kubernetes.io/projected/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-kube-api-access-jnt48\") pod \"collect-profiles-29552565-zcdmv\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.432601 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-secret-volume\") pod \"collect-profiles-29552565-zcdmv\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.432672 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnt48\" (UniqueName: \"kubernetes.io/projected/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-kube-api-access-jnt48\") pod \"collect-profiles-29552565-zcdmv\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.432866 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-config-volume\") pod \"collect-profiles-29552565-zcdmv\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.433790 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-config-volume\") pod \"collect-profiles-29552565-zcdmv\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.444165 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-secret-volume\") pod \"collect-profiles-29552565-zcdmv\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.453263 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnt48\" (UniqueName: \"kubernetes.io/projected/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-kube-api-access-jnt48\") pod \"collect-profiles-29552565-zcdmv\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.483130 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:00 crc kubenswrapper[4911]: I0310 14:45:00.963211 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv"] Mar 10 14:45:01 crc kubenswrapper[4911]: I0310 14:45:01.205879 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"c92603dfaddc182b1a9ac490086f6b37810dc88d297b219df86ab89393789379"} Mar 10 14:45:01 crc kubenswrapper[4911]: I0310 14:45:01.215684 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" event={"ID":"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e","Type":"ContainerStarted","Data":"e810240384d5f65830b116bd442558d3b5afb0e63909f00da4d73d149ea1c6b3"} Mar 10 14:45:01 crc kubenswrapper[4911]: I0310 14:45:01.215981 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" event={"ID":"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e","Type":"ContainerStarted","Data":"4e19e97aaf868a4618027d10001fe7eeda958c993fbf1859d1dc8d67ef28f556"} Mar 10 14:45:01 crc kubenswrapper[4911]: I0310 14:45:01.276194 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" podStartSLOduration=1.276167459 podStartE2EDuration="1.276167459s" podCreationTimestamp="2026-03-10 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 14:45:01.273056836 +0000 UTC m=+2605.836576753" watchObservedRunningTime="2026-03-10 14:45:01.276167459 +0000 UTC m=+2605.839687376" Mar 10 14:45:02 crc kubenswrapper[4911]: I0310 14:45:02.231514 4911 
generic.go:334] "Generic (PLEG): container finished" podID="d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e" containerID="e810240384d5f65830b116bd442558d3b5afb0e63909f00da4d73d149ea1c6b3" exitCode=0 Mar 10 14:45:02 crc kubenswrapper[4911]: I0310 14:45:02.231604 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" event={"ID":"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e","Type":"ContainerDied","Data":"e810240384d5f65830b116bd442558d3b5afb0e63909f00da4d73d149ea1c6b3"} Mar 10 14:45:03 crc kubenswrapper[4911]: I0310 14:45:03.570153 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:03 crc kubenswrapper[4911]: I0310 14:45:03.618053 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-config-volume\") pod \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " Mar 10 14:45:03 crc kubenswrapper[4911]: I0310 14:45:03.618439 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnt48\" (UniqueName: \"kubernetes.io/projected/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-kube-api-access-jnt48\") pod \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " Mar 10 14:45:03 crc kubenswrapper[4911]: I0310 14:45:03.618533 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-secret-volume\") pod \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\" (UID: \"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e\") " Mar 10 14:45:03 crc kubenswrapper[4911]: I0310 14:45:03.618907 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-config-volume" (OuterVolumeSpecName: "config-volume") pod "d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e" (UID: "d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:45:03 crc kubenswrapper[4911]: I0310 14:45:03.619224 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 14:45:03 crc kubenswrapper[4911]: I0310 14:45:03.625249 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-kube-api-access-jnt48" (OuterVolumeSpecName: "kube-api-access-jnt48") pod "d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e" (UID: "d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e"). InnerVolumeSpecName "kube-api-access-jnt48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:45:03 crc kubenswrapper[4911]: I0310 14:45:03.625514 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e" (UID: "d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:45:03 crc kubenswrapper[4911]: I0310 14:45:03.721650 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnt48\" (UniqueName: \"kubernetes.io/projected/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-kube-api-access-jnt48\") on node \"crc\" DevicePath \"\"" Mar 10 14:45:03 crc kubenswrapper[4911]: I0310 14:45:03.721700 4911 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 14:45:04 crc kubenswrapper[4911]: I0310 14:45:04.251114 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" event={"ID":"d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e","Type":"ContainerDied","Data":"4e19e97aaf868a4618027d10001fe7eeda958c993fbf1859d1dc8d67ef28f556"} Mar 10 14:45:04 crc kubenswrapper[4911]: I0310 14:45:04.251165 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552565-zcdmv" Mar 10 14:45:04 crc kubenswrapper[4911]: I0310 14:45:04.251169 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e19e97aaf868a4618027d10001fe7eeda958c993fbf1859d1dc8d67ef28f556" Mar 10 14:45:04 crc kubenswrapper[4911]: I0310 14:45:04.671774 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2"] Mar 10 14:45:04 crc kubenswrapper[4911]: I0310 14:45:04.679604 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552520-b2vs2"] Mar 10 14:45:06 crc kubenswrapper[4911]: I0310 14:45:06.212511 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d20f65c-3043-4310-83f1-300d0283f9b4" path="/var/lib/kubelet/pods/0d20f65c-3043-4310-83f1-300d0283f9b4/volumes" Mar 10 14:45:56 crc kubenswrapper[4911]: I0310 14:45:56.408075 4911 scope.go:117] "RemoveContainer" containerID="06fde4e3bfd3899571ef27f1d700ec6e54d75a1071ddfb599bdb42f85933d32a" Mar 10 14:45:56 crc kubenswrapper[4911]: I0310 14:45:56.778272 4911 generic.go:334] "Generic (PLEG): container finished" podID="43b7e07c-895e-46e1-9863-4dc4845a72ea" containerID="d9b5d5630603aae2715d6ab2a7dbc7fdc46c0a7a5b47d7d416fabd5d932e771c" exitCode=0 Mar 10 14:45:56 crc kubenswrapper[4911]: I0310 14:45:56.778369 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" event={"ID":"43b7e07c-895e-46e1-9863-4dc4845a72ea","Type":"ContainerDied","Data":"d9b5d5630603aae2715d6ab2a7dbc7fdc46c0a7a5b47d7d416fabd5d932e771c"} Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.204478 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.388841 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-ssh-key-openstack-edpm-ipam\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.388896 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-1\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.388920 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-inventory\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.388944 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-2\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.388983 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-1\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.389004 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h48gx\" (UniqueName: \"kubernetes.io/projected/43b7e07c-895e-46e1-9863-4dc4845a72ea-kube-api-access-h48gx\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.389045 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-extra-config-0\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.389089 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-combined-ca-bundle\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.389144 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-0\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.389195 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-3\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.389228 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-0\") pod \"43b7e07c-895e-46e1-9863-4dc4845a72ea\" (UID: \"43b7e07c-895e-46e1-9863-4dc4845a72ea\") " Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.418066 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b7e07c-895e-46e1-9863-4dc4845a72ea-kube-api-access-h48gx" (OuterVolumeSpecName: "kube-api-access-h48gx") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "kube-api-access-h48gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.420975 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.445511 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.454270 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.455113 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.455345 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-inventory" (OuterVolumeSpecName: "inventory") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.459452 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.461670 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.461790 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.477672 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.479021 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "43b7e07c-895e-46e1-9863-4dc4845a72ea" (UID: "43b7e07c-895e-46e1-9863-4dc4845a72ea"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491580 4911 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491641 4911 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491656 4911 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491669 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491682 4911 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491696 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491709 4911 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491743 4911 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491756 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h48gx\" (UniqueName: \"kubernetes.io/projected/43b7e07c-895e-46e1-9863-4dc4845a72ea-kube-api-access-h48gx\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491767 4911 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.491778 4911 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b7e07c-895e-46e1-9863-4dc4845a72ea-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.806471 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc" event={"ID":"43b7e07c-895e-46e1-9863-4dc4845a72ea","Type":"ContainerDied","Data":"2ead0d7aca353b5f804366afa0218774c42ae2a51e3e4f367dfa0c794f5b2b74"}
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.806560 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ead0d7aca353b5f804366afa0218774c42ae2a51e3e4f367dfa0c794f5b2b74"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.806505 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dksjc"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.922288 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"]
Mar 10 14:45:58 crc kubenswrapper[4911]: E0310 14:45:58.927520 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e" containerName="collect-profiles"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.927571 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e" containerName="collect-profiles"
Mar 10 14:45:58 crc kubenswrapper[4911]: E0310 14:45:58.927603 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b7e07c-895e-46e1-9863-4dc4845a72ea" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.927613 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b7e07c-895e-46e1-9863-4dc4845a72ea" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.927924 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b7e07c-895e-46e1-9863-4dc4845a72ea" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.927967 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60c5bf1-8e4c-444d-9227-bc7b1e6e6f6e" containerName="collect-profiles"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.928999 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.931791 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.935958 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.936655 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.936885 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xllqc"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.947130 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 14:45:58 crc kubenswrapper[4911]: I0310 14:45:58.956886 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"]
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.102742 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.103121 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.103226 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l64zm\" (UniqueName: \"kubernetes.io/projected/1fe4191c-9c8e-4d7c-9323-0fce2c397878-kube-api-access-l64zm\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.103307 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.103390 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.103427 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.103638 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.205733 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.206221 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.206253 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.206296 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l64zm\" (UniqueName: \"kubernetes.io/projected/1fe4191c-9c8e-4d7c-9323-0fce2c397878-kube-api-access-l64zm\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.206345 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.206385 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.206420 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.214056 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.214082 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.215907 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.216176 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.216230 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.217134 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.225367 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l64zm\" (UniqueName: \"kubernetes.io/projected/1fe4191c-9c8e-4d7c-9323-0fce2c397878-kube-api-access-l64zm\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:45:59 crc kubenswrapper[4911]: I0310 14:45:59.256155 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:45:59.772449 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6"]
Mar 10 14:46:00 crc kubenswrapper[4911]: W0310 14:45:59.775812 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fe4191c_9c8e_4d7c_9323_0fce2c397878.slice/crio-24f83c539876b7cb73d1780ce918c4f0681cce4095f5a02182d07079b1a4ce03 WatchSource:0}: Error finding container 24f83c539876b7cb73d1780ce918c4f0681cce4095f5a02182d07079b1a4ce03: Status 404 returned error can't find the container with id 24f83c539876b7cb73d1780ce918c4f0681cce4095f5a02182d07079b1a4ce03
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:45:59.821534 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6" event={"ID":"1fe4191c-9c8e-4d7c-9323-0fce2c397878","Type":"ContainerStarted","Data":"24f83c539876b7cb73d1780ce918c4f0681cce4095f5a02182d07079b1a4ce03"}
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.156153 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552566-c2rz2"]
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.158850 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552566-c2rz2"
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.161182 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.161337 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.161385 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.170932 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552566-c2rz2"]
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.347321 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2tx2\" (UniqueName: \"kubernetes.io/projected/97a0ac3c-c2e3-41d9-9da5-24745865c184-kube-api-access-p2tx2\") pod \"auto-csr-approver-29552566-c2rz2\" (UID: \"97a0ac3c-c2e3-41d9-9da5-24745865c184\") " pod="openshift-infra/auto-csr-approver-29552566-c2rz2"
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.449229 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2tx2\" (UniqueName: \"kubernetes.io/projected/97a0ac3c-c2e3-41d9-9da5-24745865c184-kube-api-access-p2tx2\") pod \"auto-csr-approver-29552566-c2rz2\" (UID: \"97a0ac3c-c2e3-41d9-9da5-24745865c184\") " pod="openshift-infra/auto-csr-approver-29552566-c2rz2"
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.467074 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2tx2\" (UniqueName: \"kubernetes.io/projected/97a0ac3c-c2e3-41d9-9da5-24745865c184-kube-api-access-p2tx2\") pod \"auto-csr-approver-29552566-c2rz2\" (UID: \"97a0ac3c-c2e3-41d9-9da5-24745865c184\") " pod="openshift-infra/auto-csr-approver-29552566-c2rz2"
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.492562 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552566-c2rz2"
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.832162 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6" event={"ID":"1fe4191c-9c8e-4d7c-9323-0fce2c397878","Type":"ContainerStarted","Data":"fc71bbf34decb88e057dd3b6b05a369379e74fd21fa2f9e3bb9139496da44f6f"}
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.957663 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6" podStartSLOduration=2.775819181 podStartE2EDuration="2.957631676s" podCreationTimestamp="2026-03-10 14:45:58 +0000 UTC" firstStartedPulling="2026-03-10 14:45:59.778975184 +0000 UTC m=+2664.342495101" lastFinishedPulling="2026-03-10 14:45:59.960787679 +0000 UTC m=+2664.524307596" observedRunningTime="2026-03-10 14:46:00.855846892 +0000 UTC m=+2665.419366799" watchObservedRunningTime="2026-03-10 14:46:00.957631676 +0000 UTC m=+2665.521151593"
Mar 10 14:46:00 crc kubenswrapper[4911]: I0310 14:46:00.965352 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552566-c2rz2"]
Mar 10 14:46:00 crc kubenswrapper[4911]: W0310 14:46:00.969314 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97a0ac3c_c2e3_41d9_9da5_24745865c184.slice/crio-7194a8e12ec2a72156beaeb6e29be959d81b982033f879eb17977c919bcac2a0 WatchSource:0}: Error finding container 7194a8e12ec2a72156beaeb6e29be959d81b982033f879eb17977c919bcac2a0: Status 404 returned error can't find the container with id 7194a8e12ec2a72156beaeb6e29be959d81b982033f879eb17977c919bcac2a0
Mar 10 14:46:01 crc kubenswrapper[4911]: I0310 14:46:01.845232 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552566-c2rz2" event={"ID":"97a0ac3c-c2e3-41d9-9da5-24745865c184","Type":"ContainerStarted","Data":"7194a8e12ec2a72156beaeb6e29be959d81b982033f879eb17977c919bcac2a0"}
Mar 10 14:46:02 crc kubenswrapper[4911]: I0310 14:46:02.857353 4911 generic.go:334] "Generic (PLEG): container finished" podID="97a0ac3c-c2e3-41d9-9da5-24745865c184" containerID="77425a8168accf4bb764468f4730f411992b0679c72425ac86375a77239f9719" exitCode=0
Mar 10 14:46:02 crc kubenswrapper[4911]: I0310 14:46:02.857639 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552566-c2rz2" event={"ID":"97a0ac3c-c2e3-41d9-9da5-24745865c184","Type":"ContainerDied","Data":"77425a8168accf4bb764468f4730f411992b0679c72425ac86375a77239f9719"}
Mar 10 14:46:04 crc kubenswrapper[4911]: I0310 14:46:04.197792 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552566-c2rz2"
Mar 10 14:46:04 crc kubenswrapper[4911]: I0310 14:46:04.337751 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2tx2\" (UniqueName: \"kubernetes.io/projected/97a0ac3c-c2e3-41d9-9da5-24745865c184-kube-api-access-p2tx2\") pod \"97a0ac3c-c2e3-41d9-9da5-24745865c184\" (UID: \"97a0ac3c-c2e3-41d9-9da5-24745865c184\") "
Mar 10 14:46:04 crc kubenswrapper[4911]: I0310 14:46:04.344982 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a0ac3c-c2e3-41d9-9da5-24745865c184-kube-api-access-p2tx2" (OuterVolumeSpecName: "kube-api-access-p2tx2") pod "97a0ac3c-c2e3-41d9-9da5-24745865c184" (UID: "97a0ac3c-c2e3-41d9-9da5-24745865c184"). InnerVolumeSpecName "kube-api-access-p2tx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:46:04 crc kubenswrapper[4911]: I0310 14:46:04.442027 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2tx2\" (UniqueName: \"kubernetes.io/projected/97a0ac3c-c2e3-41d9-9da5-24745865c184-kube-api-access-p2tx2\") on node \"crc\" DevicePath \"\""
Mar 10 14:46:04 crc kubenswrapper[4911]: I0310 14:46:04.878494 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552566-c2rz2" event={"ID":"97a0ac3c-c2e3-41d9-9da5-24745865c184","Type":"ContainerDied","Data":"7194a8e12ec2a72156beaeb6e29be959d81b982033f879eb17977c919bcac2a0"}
Mar 10 14:46:04 crc kubenswrapper[4911]: I0310 14:46:04.878547 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7194a8e12ec2a72156beaeb6e29be959d81b982033f879eb17977c919bcac2a0"
Mar 10 14:46:04 crc kubenswrapper[4911]: I0310 14:46:04.878578 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552566-c2rz2"
Mar 10 14:46:05 crc kubenswrapper[4911]: I0310 14:46:05.282178 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552560-9k5z5"]
Mar 10 14:46:05 crc kubenswrapper[4911]: I0310 14:46:05.292095 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552560-9k5z5"]
Mar 10 14:46:06 crc kubenswrapper[4911]: I0310 14:46:06.219371 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baff6eff-7b06-496a-843b-995a53744848" path="/var/lib/kubelet/pods/baff6eff-7b06-496a-843b-995a53744848/volumes"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.177668 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tgkms"]
Mar 10 14:46:40 crc kubenswrapper[4911]: E0310 14:46:40.178871 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a0ac3c-c2e3-41d9-9da5-24745865c184" containerName="oc"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.178891 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a0ac3c-c2e3-41d9-9da5-24745865c184" containerName="oc"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.179154 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="97a0ac3c-c2e3-41d9-9da5-24745865c184" containerName="oc"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.181047 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.216506 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tgkms"]
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.275971 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-utilities\") pod \"redhat-operators-tgkms\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") " pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.276079 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-catalog-content\") pod \"redhat-operators-tgkms\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") " pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.276249 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x85qs\" (UniqueName: \"kubernetes.io/projected/78b21cda-15f2-4872-b71d-5f45e1678ece-kube-api-access-x85qs\") pod \"redhat-operators-tgkms\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") " pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.378562 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-utilities\") pod \"redhat-operators-tgkms\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") " pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.378639 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-catalog-content\") pod \"redhat-operators-tgkms\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") " pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.378743 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x85qs\" (UniqueName: \"kubernetes.io/projected/78b21cda-15f2-4872-b71d-5f45e1678ece-kube-api-access-x85qs\") pod \"redhat-operators-tgkms\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") " pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.379282 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-utilities\") pod \"redhat-operators-tgkms\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") " pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.379321 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-catalog-content\") pod \"redhat-operators-tgkms\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") " pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.401233 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x85qs\" (UniqueName: \"kubernetes.io/projected/78b21cda-15f2-4872-b71d-5f45e1678ece-kube-api-access-x85qs\") pod \"redhat-operators-tgkms\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") " pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:40 crc kubenswrapper[4911]: I0310 14:46:40.513941 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:41 crc kubenswrapper[4911]: I0310 14:46:41.012260 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tgkms"]
Mar 10 14:46:41 crc kubenswrapper[4911]: I0310 14:46:41.293957 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgkms" event={"ID":"78b21cda-15f2-4872-b71d-5f45e1678ece","Type":"ContainerStarted","Data":"9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7"}
Mar 10 14:46:41 crc kubenswrapper[4911]: I0310 14:46:41.294013 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgkms" event={"ID":"78b21cda-15f2-4872-b71d-5f45e1678ece","Type":"ContainerStarted","Data":"fd518ac685ce5b1d6b6eb9acbe348e30c28061711d1214fbdb616118502ec0cd"}
Mar 10 14:46:42 crc kubenswrapper[4911]: I0310 14:46:42.306255 4911 generic.go:334] "Generic (PLEG): container finished" podID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerID="9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7" exitCode=0
Mar 10 14:46:42 crc kubenswrapper[4911]: I0310 14:46:42.306354 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgkms" event={"ID":"78b21cda-15f2-4872-b71d-5f45e1678ece","Type":"ContainerDied","Data":"9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7"}
Mar 10 14:46:44 crc kubenswrapper[4911]: I0310 14:46:44.329696 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgkms" event={"ID":"78b21cda-15f2-4872-b71d-5f45e1678ece","Type":"ContainerStarted","Data":"f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5"}
Mar 10 14:46:45 crc kubenswrapper[4911]: I0310 14:46:45.339822 4911 generic.go:334] "Generic (PLEG): container finished" podID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerID="f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5" exitCode=0
Mar 10 14:46:45 crc kubenswrapper[4911]: I0310 14:46:45.339873 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgkms" event={"ID":"78b21cda-15f2-4872-b71d-5f45e1678ece","Type":"ContainerDied","Data":"f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5"}
Mar 10 14:46:46 crc kubenswrapper[4911]: I0310 14:46:46.354619 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgkms" event={"ID":"78b21cda-15f2-4872-b71d-5f45e1678ece","Type":"ContainerStarted","Data":"d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb"}
Mar 10 14:46:46 crc kubenswrapper[4911]: I0310 14:46:46.384016 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tgkms" podStartSLOduration=2.939378859 podStartE2EDuration="6.383985481s" podCreationTimestamp="2026-03-10 14:46:40 +0000 UTC" firstStartedPulling="2026-03-10 14:46:42.308748772 +0000 UTC m=+2706.872268689" lastFinishedPulling="2026-03-10 14:46:45.753355394 +0000 UTC m=+2710.316875311" observedRunningTime="2026-03-10 14:46:46.374141137 +0000 UTC m=+2710.937661074" watchObservedRunningTime="2026-03-10 14:46:46.383985481 +0000 UTC m=+2710.947505408"
Mar 10 14:46:50 crc kubenswrapper[4911]: I0310 14:46:50.514851 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:50 crc kubenswrapper[4911]: I0310 14:46:50.515351 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:46:51 crc kubenswrapper[4911]: I0310 14:46:51.559643 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tgkms" podUID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerName="registry-server" probeResult="failure" output=<
Mar 10 14:46:51 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s
Mar 10 14:46:51 crc kubenswrapper[4911]: >
Mar 10 14:46:56 crc kubenswrapper[4911]: I0310 14:46:56.484770 4911 scope.go:117] "RemoveContainer" containerID="28183eeb0abe3123897d6e9a36ab8e6f288cc23ebc498379a315b1c055d99009"
Mar 10 14:46:56 crc kubenswrapper[4911]: I0310 14:46:56.511431 4911 scope.go:117] "RemoveContainer" containerID="07e5192a487303d893800527b670d88be39fc9185bf2d0c5283021fd4d6772af"
Mar 10 14:46:56 crc kubenswrapper[4911]: I0310 14:46:56.538765 4911 scope.go:117] "RemoveContainer" containerID="31571360feff5618be2e17cd09d30a0aa4c6a2438b9a0f3e30a4164e554768d0"
Mar 10 14:46:56 crc kubenswrapper[4911]: I0310 14:46:56.583062 4911 scope.go:117] "RemoveContainer" containerID="63debf0da99332146fa4eb94fb27bd691fd0cd020c5f4402a63139473195c69f"
Mar 10 14:47:00 crc kubenswrapper[4911]: I0310 14:47:00.566202 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:47:00 crc kubenswrapper[4911]: I0310 14:47:00.616993 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:47:00 crc kubenswrapper[4911]: I0310 14:47:00.811431 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tgkms"]
Mar 10 14:47:02 crc kubenswrapper[4911]: I0310 14:47:02.514322 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tgkms" podUID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerName="registry-server" containerID="cri-o://d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb" gracePeriod=2
Mar 10 14:47:02 crc kubenswrapper[4911]: I0310 14:47:02.978619 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tgkms"
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.129416 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x85qs\" (UniqueName: \"kubernetes.io/projected/78b21cda-15f2-4872-b71d-5f45e1678ece-kube-api-access-x85qs\") pod \"78b21cda-15f2-4872-b71d-5f45e1678ece\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") "
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.129463 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-catalog-content\") pod \"78b21cda-15f2-4872-b71d-5f45e1678ece\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") "
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.129687 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-utilities\") pod \"78b21cda-15f2-4872-b71d-5f45e1678ece\" (UID: \"78b21cda-15f2-4872-b71d-5f45e1678ece\") "
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.130977 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-utilities" (OuterVolumeSpecName: "utilities") pod "78b21cda-15f2-4872-b71d-5f45e1678ece" (UID: "78b21cda-15f2-4872-b71d-5f45e1678ece"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.136305 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b21cda-15f2-4872-b71d-5f45e1678ece-kube-api-access-x85qs" (OuterVolumeSpecName: "kube-api-access-x85qs") pod "78b21cda-15f2-4872-b71d-5f45e1678ece" (UID: "78b21cda-15f2-4872-b71d-5f45e1678ece"). InnerVolumeSpecName "kube-api-access-x85qs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.252596 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.252642 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x85qs\" (UniqueName: \"kubernetes.io/projected/78b21cda-15f2-4872-b71d-5f45e1678ece-kube-api-access-x85qs\") on node \"crc\" DevicePath \"\""
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.283909 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78b21cda-15f2-4872-b71d-5f45e1678ece" (UID: "78b21cda-15f2-4872-b71d-5f45e1678ece"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.354626 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b21cda-15f2-4872-b71d-5f45e1678ece-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.530663 4911 generic.go:334] "Generic (PLEG): container finished" podID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerID="d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb" exitCode=0
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.530959 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgkms" event={"ID":"78b21cda-15f2-4872-b71d-5f45e1678ece","Type":"ContainerDied","Data":"d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb"}
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.532328 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgkms" event={"ID":"78b21cda-15f2-4872-b71d-5f45e1678ece","Type":"ContainerDied","Data":"fd518ac685ce5b1d6b6eb9acbe348e30c28061711d1214fbdb616118502ec0cd"}
Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.530994 4911 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-tgkms" Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.532378 4911 scope.go:117] "RemoveContainer" containerID="d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb" Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.569038 4911 scope.go:117] "RemoveContainer" containerID="f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5" Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.580863 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tgkms"] Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.591380 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tgkms"] Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.597249 4911 scope.go:117] "RemoveContainer" containerID="9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7" Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.649137 4911 scope.go:117] "RemoveContainer" containerID="d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb" Mar 10 14:47:03 crc kubenswrapper[4911]: E0310 14:47:03.649599 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb\": container with ID starting with d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb not found: ID does not exist" containerID="d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb" Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.649642 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb"} err="failed to get container status \"d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb\": rpc error: code = NotFound desc = could not find container 
\"d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb\": container with ID starting with d042195ba7b6583488a4d59df08462fbfd253dc54950fc4133b2a48510cda2eb not found: ID does not exist" Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.649675 4911 scope.go:117] "RemoveContainer" containerID="f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5" Mar 10 14:47:03 crc kubenswrapper[4911]: E0310 14:47:03.650298 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5\": container with ID starting with f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5 not found: ID does not exist" containerID="f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5" Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.650426 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5"} err="failed to get container status \"f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5\": rpc error: code = NotFound desc = could not find container \"f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5\": container with ID starting with f1cb7dde0c557f662ce3fa93e4090d133c76847bb2f5da9df556f94655ded2f5 not found: ID does not exist" Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.650546 4911 scope.go:117] "RemoveContainer" containerID="9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7" Mar 10 14:47:03 crc kubenswrapper[4911]: E0310 14:47:03.650985 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7\": container with ID starting with 9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7 not found: ID does not exist" 
containerID="9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7" Mar 10 14:47:03 crc kubenswrapper[4911]: I0310 14:47:03.651069 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7"} err="failed to get container status \"9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7\": rpc error: code = NotFound desc = could not find container \"9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7\": container with ID starting with 9bd39a60b15c6b272ffe24e750e80f0136472348c32f7b0fa0158642e02ebbf7 not found: ID does not exist" Mar 10 14:47:04 crc kubenswrapper[4911]: I0310 14:47:04.204132 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b21cda-15f2-4872-b71d-5f45e1678ece" path="/var/lib/kubelet/pods/78b21cda-15f2-4872-b71d-5f45e1678ece/volumes" Mar 10 14:47:18 crc kubenswrapper[4911]: I0310 14:47:18.521366 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:47:18 crc kubenswrapper[4911]: I0310 14:47:18.521960 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.666814 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hpljz"] Mar 10 14:47:22 crc kubenswrapper[4911]: E0310 14:47:22.670921 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerName="extract-content" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.670961 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerName="extract-content" Mar 10 14:47:22 crc kubenswrapper[4911]: E0310 14:47:22.670991 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerName="registry-server" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.671001 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerName="registry-server" Mar 10 14:47:22 crc kubenswrapper[4911]: E0310 14:47:22.671483 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerName="extract-utilities" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.671497 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerName="extract-utilities" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.671961 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b21cda-15f2-4872-b71d-5f45e1678ece" containerName="registry-server" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.673860 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.690109 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpljz"] Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.777241 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-utilities\") pod \"redhat-marketplace-hpljz\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.777323 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-catalog-content\") pod \"redhat-marketplace-hpljz\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.777388 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xwmj\" (UniqueName: \"kubernetes.io/projected/4961fd24-97ee-4ee5-b643-07ea551d6d81-kube-api-access-4xwmj\") pod \"redhat-marketplace-hpljz\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.879181 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-utilities\") pod \"redhat-marketplace-hpljz\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.879351 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-catalog-content\") pod \"redhat-marketplace-hpljz\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.879399 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xwmj\" (UniqueName: \"kubernetes.io/projected/4961fd24-97ee-4ee5-b643-07ea551d6d81-kube-api-access-4xwmj\") pod \"redhat-marketplace-hpljz\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.879903 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-utilities\") pod \"redhat-marketplace-hpljz\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.880207 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-catalog-content\") pod \"redhat-marketplace-hpljz\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:22 crc kubenswrapper[4911]: I0310 14:47:22.900400 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xwmj\" (UniqueName: \"kubernetes.io/projected/4961fd24-97ee-4ee5-b643-07ea551d6d81-kube-api-access-4xwmj\") pod \"redhat-marketplace-hpljz\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:23 crc kubenswrapper[4911]: I0310 14:47:23.003862 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:23 crc kubenswrapper[4911]: I0310 14:47:23.570030 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpljz"] Mar 10 14:47:23 crc kubenswrapper[4911]: I0310 14:47:23.719481 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpljz" event={"ID":"4961fd24-97ee-4ee5-b643-07ea551d6d81","Type":"ContainerStarted","Data":"0750aabcd57af06c48c10435637b3c521bda0c6ad02540b552de52d9e7cc1f13"} Mar 10 14:47:24 crc kubenswrapper[4911]: I0310 14:47:24.740495 4911 generic.go:334] "Generic (PLEG): container finished" podID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerID="1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9" exitCode=0 Mar 10 14:47:24 crc kubenswrapper[4911]: I0310 14:47:24.740606 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpljz" event={"ID":"4961fd24-97ee-4ee5-b643-07ea551d6d81","Type":"ContainerDied","Data":"1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9"} Mar 10 14:47:25 crc kubenswrapper[4911]: I0310 14:47:25.754189 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpljz" event={"ID":"4961fd24-97ee-4ee5-b643-07ea551d6d81","Type":"ContainerStarted","Data":"3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270"} Mar 10 14:47:26 crc kubenswrapper[4911]: I0310 14:47:26.767472 4911 generic.go:334] "Generic (PLEG): container finished" podID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerID="3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270" exitCode=0 Mar 10 14:47:26 crc kubenswrapper[4911]: I0310 14:47:26.767589 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpljz" 
event={"ID":"4961fd24-97ee-4ee5-b643-07ea551d6d81","Type":"ContainerDied","Data":"3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270"} Mar 10 14:47:27 crc kubenswrapper[4911]: I0310 14:47:27.792457 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpljz" event={"ID":"4961fd24-97ee-4ee5-b643-07ea551d6d81","Type":"ContainerStarted","Data":"2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d"} Mar 10 14:47:27 crc kubenswrapper[4911]: I0310 14:47:27.816125 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hpljz" podStartSLOduration=3.386036032 podStartE2EDuration="5.816100094s" podCreationTimestamp="2026-03-10 14:47:22 +0000 UTC" firstStartedPulling="2026-03-10 14:47:24.748186724 +0000 UTC m=+2749.311706641" lastFinishedPulling="2026-03-10 14:47:27.178250786 +0000 UTC m=+2751.741770703" observedRunningTime="2026-03-10 14:47:27.813421503 +0000 UTC m=+2752.376941430" watchObservedRunningTime="2026-03-10 14:47:27.816100094 +0000 UTC m=+2752.379620001" Mar 10 14:47:33 crc kubenswrapper[4911]: I0310 14:47:33.004980 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:33 crc kubenswrapper[4911]: I0310 14:47:33.005580 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:33 crc kubenswrapper[4911]: I0310 14:47:33.089172 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:33 crc kubenswrapper[4911]: I0310 14:47:33.906387 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:33 crc kubenswrapper[4911]: I0310 14:47:33.970743 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-hpljz"] Mar 10 14:47:35 crc kubenswrapper[4911]: I0310 14:47:35.872939 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hpljz" podUID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerName="registry-server" containerID="cri-o://2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d" gracePeriod=2 Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.379782 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.572452 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-catalog-content\") pod \"4961fd24-97ee-4ee5-b643-07ea551d6d81\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.572609 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-utilities\") pod \"4961fd24-97ee-4ee5-b643-07ea551d6d81\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.572749 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xwmj\" (UniqueName: \"kubernetes.io/projected/4961fd24-97ee-4ee5-b643-07ea551d6d81-kube-api-access-4xwmj\") pod \"4961fd24-97ee-4ee5-b643-07ea551d6d81\" (UID: \"4961fd24-97ee-4ee5-b643-07ea551d6d81\") " Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.574046 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-utilities" (OuterVolumeSpecName: "utilities") pod "4961fd24-97ee-4ee5-b643-07ea551d6d81" (UID: 
"4961fd24-97ee-4ee5-b643-07ea551d6d81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.580102 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4961fd24-97ee-4ee5-b643-07ea551d6d81-kube-api-access-4xwmj" (OuterVolumeSpecName: "kube-api-access-4xwmj") pod "4961fd24-97ee-4ee5-b643-07ea551d6d81" (UID: "4961fd24-97ee-4ee5-b643-07ea551d6d81"). InnerVolumeSpecName "kube-api-access-4xwmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.609317 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4961fd24-97ee-4ee5-b643-07ea551d6d81" (UID: "4961fd24-97ee-4ee5-b643-07ea551d6d81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.675675 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xwmj\" (UniqueName: \"kubernetes.io/projected/4961fd24-97ee-4ee5-b643-07ea551d6d81-kube-api-access-4xwmj\") on node \"crc\" DevicePath \"\"" Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.675765 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.675780 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4961fd24-97ee-4ee5-b643-07ea551d6d81-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.884775 4911 generic.go:334] "Generic (PLEG): container finished" 
podID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerID="2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d" exitCode=0 Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.884824 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpljz" event={"ID":"4961fd24-97ee-4ee5-b643-07ea551d6d81","Type":"ContainerDied","Data":"2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d"} Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.884879 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpljz" event={"ID":"4961fd24-97ee-4ee5-b643-07ea551d6d81","Type":"ContainerDied","Data":"0750aabcd57af06c48c10435637b3c521bda0c6ad02540b552de52d9e7cc1f13"} Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.884902 4911 scope.go:117] "RemoveContainer" containerID="2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d" Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.884968 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpljz" Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.908615 4911 scope.go:117] "RemoveContainer" containerID="3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270" Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.922807 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpljz"] Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.930459 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpljz"] Mar 10 14:47:36 crc kubenswrapper[4911]: I0310 14:47:36.962591 4911 scope.go:117] "RemoveContainer" containerID="1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9" Mar 10 14:47:37 crc kubenswrapper[4911]: I0310 14:47:37.011596 4911 scope.go:117] "RemoveContainer" containerID="2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d" Mar 10 14:47:37 crc kubenswrapper[4911]: E0310 14:47:37.012294 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d\": container with ID starting with 2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d not found: ID does not exist" containerID="2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d" Mar 10 14:47:37 crc kubenswrapper[4911]: I0310 14:47:37.012364 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d"} err="failed to get container status \"2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d\": rpc error: code = NotFound desc = could not find container \"2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d\": container with ID starting with 2e1260cea0126da0e194164a8d0f0f8f61ecb401ed7b0be0920648e616ded36d not found: 
ID does not exist" Mar 10 14:47:37 crc kubenswrapper[4911]: I0310 14:47:37.012401 4911 scope.go:117] "RemoveContainer" containerID="3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270" Mar 10 14:47:37 crc kubenswrapper[4911]: E0310 14:47:37.013039 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270\": container with ID starting with 3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270 not found: ID does not exist" containerID="3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270" Mar 10 14:47:37 crc kubenswrapper[4911]: I0310 14:47:37.013080 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270"} err="failed to get container status \"3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270\": rpc error: code = NotFound desc = could not find container \"3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270\": container with ID starting with 3138344874cfa1c911cce86325758538d284907d7ba4af09e738a6aaf63b7270 not found: ID does not exist" Mar 10 14:47:37 crc kubenswrapper[4911]: I0310 14:47:37.013104 4911 scope.go:117] "RemoveContainer" containerID="1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9" Mar 10 14:47:37 crc kubenswrapper[4911]: E0310 14:47:37.013639 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9\": container with ID starting with 1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9 not found: ID does not exist" containerID="1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9" Mar 10 14:47:37 crc kubenswrapper[4911]: I0310 14:47:37.013678 4911 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9"} err="failed to get container status \"1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9\": rpc error: code = NotFound desc = could not find container \"1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9\": container with ID starting with 1ff1caf4c8f548a470f0103f147b96340d5f0ab58c0371b86f8cfc0697c2bfa9 not found: ID does not exist" Mar 10 14:47:38 crc kubenswrapper[4911]: I0310 14:47:38.215442 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4961fd24-97ee-4ee5-b643-07ea551d6d81" path="/var/lib/kubelet/pods/4961fd24-97ee-4ee5-b643-07ea551d6d81/volumes" Mar 10 14:47:48 crc kubenswrapper[4911]: I0310 14:47:48.520893 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:47:48 crc kubenswrapper[4911]: I0310 14:47:48.521807 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.149461 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552568-z8bl8"] Mar 10 14:48:00 crc kubenswrapper[4911]: E0310 14:48:00.151144 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerName="extract-content" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.151159 4911 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerName="extract-content" Mar 10 14:48:00 crc kubenswrapper[4911]: E0310 14:48:00.151191 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerName="registry-server" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.151198 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerName="registry-server" Mar 10 14:48:00 crc kubenswrapper[4911]: E0310 14:48:00.151217 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerName="extract-utilities" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.151224 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerName="extract-utilities" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.151409 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="4961fd24-97ee-4ee5-b643-07ea551d6d81" containerName="registry-server" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.152204 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552568-z8bl8" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.158450 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.159250 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.159648 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.164510 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552568-z8bl8"] Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.294888 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cbsb\" (UniqueName: \"kubernetes.io/projected/1e937b10-d5be-4e52-a7dc-fc92ab37a169-kube-api-access-4cbsb\") pod \"auto-csr-approver-29552568-z8bl8\" (UID: \"1e937b10-d5be-4e52-a7dc-fc92ab37a169\") " pod="openshift-infra/auto-csr-approver-29552568-z8bl8" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.397558 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cbsb\" (UniqueName: \"kubernetes.io/projected/1e937b10-d5be-4e52-a7dc-fc92ab37a169-kube-api-access-4cbsb\") pod \"auto-csr-approver-29552568-z8bl8\" (UID: \"1e937b10-d5be-4e52-a7dc-fc92ab37a169\") " pod="openshift-infra/auto-csr-approver-29552568-z8bl8" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.420047 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cbsb\" (UniqueName: \"kubernetes.io/projected/1e937b10-d5be-4e52-a7dc-fc92ab37a169-kube-api-access-4cbsb\") pod \"auto-csr-approver-29552568-z8bl8\" (UID: \"1e937b10-d5be-4e52-a7dc-fc92ab37a169\") " 
pod="openshift-infra/auto-csr-approver-29552568-z8bl8" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.476869 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552568-z8bl8" Mar 10 14:48:00 crc kubenswrapper[4911]: I0310 14:48:00.953702 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552568-z8bl8"] Mar 10 14:48:01 crc kubenswrapper[4911]: I0310 14:48:01.130460 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552568-z8bl8" event={"ID":"1e937b10-d5be-4e52-a7dc-fc92ab37a169","Type":"ContainerStarted","Data":"6970527d8112c006e9fad05400f18c302885ada5d97d1ba89658d49aa02945c2"} Mar 10 14:48:02 crc kubenswrapper[4911]: I0310 14:48:02.151319 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552568-z8bl8" event={"ID":"1e937b10-d5be-4e52-a7dc-fc92ab37a169","Type":"ContainerStarted","Data":"b7215ae128e520b3805668a323673602781c86132e79c5ab649de8e3f2b53377"} Mar 10 14:48:02 crc kubenswrapper[4911]: I0310 14:48:02.182863 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552568-z8bl8" podStartSLOduration=1.385601225 podStartE2EDuration="2.182838991s" podCreationTimestamp="2026-03-10 14:48:00 +0000 UTC" firstStartedPulling="2026-03-10 14:48:00.949010492 +0000 UTC m=+2785.512530409" lastFinishedPulling="2026-03-10 14:48:01.746248258 +0000 UTC m=+2786.309768175" observedRunningTime="2026-03-10 14:48:02.16896254 +0000 UTC m=+2786.732482457" watchObservedRunningTime="2026-03-10 14:48:02.182838991 +0000 UTC m=+2786.746358918" Mar 10 14:48:03 crc kubenswrapper[4911]: I0310 14:48:03.162173 4911 generic.go:334] "Generic (PLEG): container finished" podID="1e937b10-d5be-4e52-a7dc-fc92ab37a169" containerID="b7215ae128e520b3805668a323673602781c86132e79c5ab649de8e3f2b53377" exitCode=0 Mar 10 14:48:03 crc 
kubenswrapper[4911]: I0310 14:48:03.162277 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552568-z8bl8" event={"ID":"1e937b10-d5be-4e52-a7dc-fc92ab37a169","Type":"ContainerDied","Data":"b7215ae128e520b3805668a323673602781c86132e79c5ab649de8e3f2b53377"} Mar 10 14:48:04 crc kubenswrapper[4911]: I0310 14:48:04.545202 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552568-z8bl8" Mar 10 14:48:04 crc kubenswrapper[4911]: I0310 14:48:04.600994 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cbsb\" (UniqueName: \"kubernetes.io/projected/1e937b10-d5be-4e52-a7dc-fc92ab37a169-kube-api-access-4cbsb\") pod \"1e937b10-d5be-4e52-a7dc-fc92ab37a169\" (UID: \"1e937b10-d5be-4e52-a7dc-fc92ab37a169\") " Mar 10 14:48:04 crc kubenswrapper[4911]: I0310 14:48:04.608337 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e937b10-d5be-4e52-a7dc-fc92ab37a169-kube-api-access-4cbsb" (OuterVolumeSpecName: "kube-api-access-4cbsb") pod "1e937b10-d5be-4e52-a7dc-fc92ab37a169" (UID: "1e937b10-d5be-4e52-a7dc-fc92ab37a169"). InnerVolumeSpecName "kube-api-access-4cbsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:48:04 crc kubenswrapper[4911]: I0310 14:48:04.703628 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cbsb\" (UniqueName: \"kubernetes.io/projected/1e937b10-d5be-4e52-a7dc-fc92ab37a169-kube-api-access-4cbsb\") on node \"crc\" DevicePath \"\"" Mar 10 14:48:05 crc kubenswrapper[4911]: I0310 14:48:05.184878 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552568-z8bl8" event={"ID":"1e937b10-d5be-4e52-a7dc-fc92ab37a169","Type":"ContainerDied","Data":"6970527d8112c006e9fad05400f18c302885ada5d97d1ba89658d49aa02945c2"} Mar 10 14:48:05 crc kubenswrapper[4911]: I0310 14:48:05.185618 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6970527d8112c006e9fad05400f18c302885ada5d97d1ba89658d49aa02945c2" Mar 10 14:48:05 crc kubenswrapper[4911]: I0310 14:48:05.185162 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552568-z8bl8" Mar 10 14:48:05 crc kubenswrapper[4911]: I0310 14:48:05.236964 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552562-xsfhx"] Mar 10 14:48:05 crc kubenswrapper[4911]: I0310 14:48:05.250063 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552562-xsfhx"] Mar 10 14:48:06 crc kubenswrapper[4911]: I0310 14:48:06.212463 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15ed8fcd-f05b-488a-ac3e-4becbe638683" path="/var/lib/kubelet/pods/15ed8fcd-f05b-488a-ac3e-4becbe638683/volumes" Mar 10 14:48:13 crc kubenswrapper[4911]: I0310 14:48:13.263036 4911 generic.go:334] "Generic (PLEG): container finished" podID="1fe4191c-9c8e-4d7c-9323-0fce2c397878" containerID="fc71bbf34decb88e057dd3b6b05a369379e74fd21fa2f9e3bb9139496da44f6f" exitCode=0 Mar 10 14:48:13 crc kubenswrapper[4911]: I0310 14:48:13.263134 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6" event={"ID":"1fe4191c-9c8e-4d7c-9323-0fce2c397878","Type":"ContainerDied","Data":"fc71bbf34decb88e057dd3b6b05a369379e74fd21fa2f9e3bb9139496da44f6f"} Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.738517 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6" Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.923956 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ssh-key-openstack-edpm-ipam\") pod \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.924091 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-2\") pod \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.924126 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-telemetry-combined-ca-bundle\") pod \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.924200 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-inventory\") pod \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " Mar 10 14:48:14 crc 
kubenswrapper[4911]: I0310 14:48:14.924374 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-0\") pod \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.924494 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l64zm\" (UniqueName: \"kubernetes.io/projected/1fe4191c-9c8e-4d7c-9323-0fce2c397878-kube-api-access-l64zm\") pod \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.924561 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-1\") pod \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\" (UID: \"1fe4191c-9c8e-4d7c-9323-0fce2c397878\") " Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.931683 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe4191c-9c8e-4d7c-9323-0fce2c397878-kube-api-access-l64zm" (OuterVolumeSpecName: "kube-api-access-l64zm") pod "1fe4191c-9c8e-4d7c-9323-0fce2c397878" (UID: "1fe4191c-9c8e-4d7c-9323-0fce2c397878"). InnerVolumeSpecName "kube-api-access-l64zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.933638 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1fe4191c-9c8e-4d7c-9323-0fce2c397878" (UID: "1fe4191c-9c8e-4d7c-9323-0fce2c397878"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.953555 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1fe4191c-9c8e-4d7c-9323-0fce2c397878" (UID: "1fe4191c-9c8e-4d7c-9323-0fce2c397878"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.955367 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "1fe4191c-9c8e-4d7c-9323-0fce2c397878" (UID: "1fe4191c-9c8e-4d7c-9323-0fce2c397878"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.958958 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "1fe4191c-9c8e-4d7c-9323-0fce2c397878" (UID: "1fe4191c-9c8e-4d7c-9323-0fce2c397878"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.960074 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-inventory" (OuterVolumeSpecName: "inventory") pod "1fe4191c-9c8e-4d7c-9323-0fce2c397878" (UID: "1fe4191c-9c8e-4d7c-9323-0fce2c397878"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:48:14 crc kubenswrapper[4911]: I0310 14:48:14.963559 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "1fe4191c-9c8e-4d7c-9323-0fce2c397878" (UID: "1fe4191c-9c8e-4d7c-9323-0fce2c397878"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 14:48:15 crc kubenswrapper[4911]: I0310 14:48:15.027751 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 14:48:15 crc kubenswrapper[4911]: I0310 14:48:15.027836 4911 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 10 14:48:15 crc kubenswrapper[4911]: I0310 14:48:15.027852 4911 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 14:48:15 crc kubenswrapper[4911]: I0310 14:48:15.027867 4911 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 14:48:15 crc kubenswrapper[4911]: I0310 14:48:15.027886 4911 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 10 14:48:15 
crc kubenswrapper[4911]: I0310 14:48:15.027899 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l64zm\" (UniqueName: \"kubernetes.io/projected/1fe4191c-9c8e-4d7c-9323-0fce2c397878-kube-api-access-l64zm\") on node \"crc\" DevicePath \"\"" Mar 10 14:48:15 crc kubenswrapper[4911]: I0310 14:48:15.027915 4911 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1fe4191c-9c8e-4d7c-9323-0fce2c397878-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 10 14:48:15 crc kubenswrapper[4911]: I0310 14:48:15.283971 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6" event={"ID":"1fe4191c-9c8e-4d7c-9323-0fce2c397878","Type":"ContainerDied","Data":"24f83c539876b7cb73d1780ce918c4f0681cce4095f5a02182d07079b1a4ce03"} Mar 10 14:48:15 crc kubenswrapper[4911]: I0310 14:48:15.284024 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24f83c539876b7cb73d1780ce918c4f0681cce4095f5a02182d07079b1a4ce03" Mar 10 14:48:15 crc kubenswrapper[4911]: I0310 14:48:15.284043 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6" Mar 10 14:48:18 crc kubenswrapper[4911]: I0310 14:48:18.521345 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:48:18 crc kubenswrapper[4911]: I0310 14:48:18.522032 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:48:18 crc kubenswrapper[4911]: I0310 14:48:18.522088 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:48:18 crc kubenswrapper[4911]: I0310 14:48:18.522944 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c92603dfaddc182b1a9ac490086f6b37810dc88d297b219df86ab89393789379"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 14:48:18 crc kubenswrapper[4911]: I0310 14:48:18.523007 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://c92603dfaddc182b1a9ac490086f6b37810dc88d297b219df86ab89393789379" gracePeriod=600 Mar 10 14:48:19 crc kubenswrapper[4911]: I0310 14:48:19.324999 4911 generic.go:334] "Generic (PLEG): container finished" 
podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="c92603dfaddc182b1a9ac490086f6b37810dc88d297b219df86ab89393789379" exitCode=0 Mar 10 14:48:19 crc kubenswrapper[4911]: I0310 14:48:19.325078 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"c92603dfaddc182b1a9ac490086f6b37810dc88d297b219df86ab89393789379"} Mar 10 14:48:19 crc kubenswrapper[4911]: I0310 14:48:19.325597 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"} Mar 10 14:48:19 crc kubenswrapper[4911]: I0310 14:48:19.325626 4911 scope.go:117] "RemoveContainer" containerID="d72333a39073c261f9e4cc4deb81958acb94f833f6b6864e3ece31a052a2f233" Mar 10 14:48:56 crc kubenswrapper[4911]: I0310 14:48:56.752463 4911 scope.go:117] "RemoveContainer" containerID="7d2388e7593019c70700d0c2a360b0225d0387c52f7b1c6a0802f3f23fb1fe5c" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.866506 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 14:49:00 crc kubenswrapper[4911]: E0310 14:49:00.871276 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e937b10-d5be-4e52-a7dc-fc92ab37a169" containerName="oc" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.871308 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e937b10-d5be-4e52-a7dc-fc92ab37a169" containerName="oc" Mar 10 14:49:00 crc kubenswrapper[4911]: E0310 14:49:00.871341 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe4191c-9c8e-4d7c-9323-0fce2c397878" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.871352 4911 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe4191c-9c8e-4d7c-9323-0fce2c397878" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.871648 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe4191c-9c8e-4d7c-9323-0fce2c397878" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.871692 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e937b10-d5be-4e52-a7dc-fc92ab37a169" containerName="oc" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.872651 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.875583 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.875957 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-k2nwb" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.875997 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.877376 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.902973 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.953673 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-config-data\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " 
pod="openstack/tempest-tests-tempest" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.953756 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:00 crc kubenswrapper[4911]: I0310 14:49:00.953910 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.059043 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.059117 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.059154 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9c7v\" (UniqueName: \"kubernetes.io/projected/c6a78318-420c-43fe-98f3-9306e18ee2d4-kube-api-access-c9c7v\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " 
pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.059212 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.059234 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.059265 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.059284 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-config-data\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.059308 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " 
pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.059354 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.060644 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.061763 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-config-data\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.078520 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.165141 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.165681 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.165757 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9c7v\" (UniqueName: \"kubernetes.io/projected/c6a78318-420c-43fe-98f3-9306e18ee2d4-kube-api-access-c9c7v\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.165832 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.165859 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.165906 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.166408 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.166456 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.167740 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.177649 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.178572 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.185612 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9c7v\" (UniqueName: \"kubernetes.io/projected/c6a78318-420c-43fe-98f3-9306e18ee2d4-kube-api-access-c9c7v\") pod 
\"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.207168 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") " pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.504893 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.960196 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 14:49:01 crc kubenswrapper[4911]: I0310 14:49:01.969996 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 14:49:02 crc kubenswrapper[4911]: I0310 14:49:02.771996 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c6a78318-420c-43fe-98f3-9306e18ee2d4","Type":"ContainerStarted","Data":"cec02596dce5cb5161a23e8d1e4e02523ba0de43c56ed8f10e5ccdc8baa804cb"} Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.024708 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hnjxx"] Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.028143 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.036097 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hnjxx"] Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.154370 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-utilities\") pod \"community-operators-hnjxx\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.154469 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x82bf\" (UniqueName: \"kubernetes.io/projected/194679a7-5470-463e-a223-2c77d8e58cef-kube-api-access-x82bf\") pod \"community-operators-hnjxx\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.154529 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-catalog-content\") pod \"community-operators-hnjxx\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.257255 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-catalog-content\") pod \"community-operators-hnjxx\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.257464 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-utilities\") pod \"community-operators-hnjxx\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.257498 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x82bf\" (UniqueName: \"kubernetes.io/projected/194679a7-5470-463e-a223-2c77d8e58cef-kube-api-access-x82bf\") pod \"community-operators-hnjxx\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.257648 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-catalog-content\") pod \"community-operators-hnjxx\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.258021 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-utilities\") pod \"community-operators-hnjxx\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.293069 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x82bf\" (UniqueName: \"kubernetes.io/projected/194679a7-5470-463e-a223-2c77d8e58cef-kube-api-access-x82bf\") pod \"community-operators-hnjxx\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:16 crc kubenswrapper[4911]: I0310 14:49:16.379478 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:35 crc kubenswrapper[4911]: E0310 14:49:35.454711 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 10 14:49:35 crc kubenswrapper[4911]: E0310 14:49:35.455631 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trus
t/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c9c7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(c6a78318-420c-43fe-98f3-9306e18ee2d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 14:49:35 crc kubenswrapper[4911]: E0310 14:49:35.459109 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="c6a78318-420c-43fe-98f3-9306e18ee2d4" Mar 10 14:49:35 crc kubenswrapper[4911]: I0310 14:49:35.959579 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hnjxx"] Mar 10 14:49:36 crc kubenswrapper[4911]: I0310 14:49:36.361649 4911 generic.go:334] "Generic (PLEG): container finished" podID="194679a7-5470-463e-a223-2c77d8e58cef" containerID="921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399" exitCode=0 Mar 10 14:49:36 crc kubenswrapper[4911]: I0310 14:49:36.361996 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnjxx" event={"ID":"194679a7-5470-463e-a223-2c77d8e58cef","Type":"ContainerDied","Data":"921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399"} Mar 10 14:49:36 crc kubenswrapper[4911]: I0310 14:49:36.362246 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnjxx" event={"ID":"194679a7-5470-463e-a223-2c77d8e58cef","Type":"ContainerStarted","Data":"bfe5fdec1945702fde36fbd617cd38dc7ca551571ad83168eedab03f6648d9ac"} Mar 10 14:49:36 crc kubenswrapper[4911]: E0310 14:49:36.364775 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="c6a78318-420c-43fe-98f3-9306e18ee2d4" Mar 10 14:49:37 crc kubenswrapper[4911]: I0310 14:49:37.377206 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnjxx" event={"ID":"194679a7-5470-463e-a223-2c77d8e58cef","Type":"ContainerStarted","Data":"175b1245d2756d04069e4e28425b549c2e48b010fa68abbaba51723bcf5b2699"} Mar 10 14:49:38 crc kubenswrapper[4911]: I0310 14:49:38.392639 4911 generic.go:334] "Generic (PLEG): container finished" 
podID="194679a7-5470-463e-a223-2c77d8e58cef" containerID="175b1245d2756d04069e4e28425b549c2e48b010fa68abbaba51723bcf5b2699" exitCode=0 Mar 10 14:49:38 crc kubenswrapper[4911]: I0310 14:49:38.392791 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnjxx" event={"ID":"194679a7-5470-463e-a223-2c77d8e58cef","Type":"ContainerDied","Data":"175b1245d2756d04069e4e28425b549c2e48b010fa68abbaba51723bcf5b2699"} Mar 10 14:49:39 crc kubenswrapper[4911]: I0310 14:49:39.423559 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnjxx" event={"ID":"194679a7-5470-463e-a223-2c77d8e58cef","Type":"ContainerStarted","Data":"376fdc6061e9175f8f39287ad410f932fc989b04da063d68d552c10caf71ba78"} Mar 10 14:49:39 crc kubenswrapper[4911]: I0310 14:49:39.463122 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hnjxx" podStartSLOduration=21.914691177999998 podStartE2EDuration="24.463081626s" podCreationTimestamp="2026-03-10 14:49:15 +0000 UTC" firstStartedPulling="2026-03-10 14:49:36.36471925 +0000 UTC m=+2880.928239167" lastFinishedPulling="2026-03-10 14:49:38.913109698 +0000 UTC m=+2883.476629615" observedRunningTime="2026-03-10 14:49:39.453486939 +0000 UTC m=+2884.017006856" watchObservedRunningTime="2026-03-10 14:49:39.463081626 +0000 UTC m=+2884.026601573" Mar 10 14:49:44 crc kubenswrapper[4911]: E0310 14:49:44.766358 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-conmon-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache]" Mar 10 14:49:46 crc kubenswrapper[4911]: I0310 14:49:46.380529 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:46 crc kubenswrapper[4911]: I0310 14:49:46.380948 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:46 crc kubenswrapper[4911]: I0310 14:49:46.471622 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:46 crc kubenswrapper[4911]: I0310 14:49:46.597476 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:47 crc kubenswrapper[4911]: I0310 14:49:47.227308 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hnjxx"] Mar 10 14:49:48 crc kubenswrapper[4911]: I0310 14:49:48.543618 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hnjxx" podUID="194679a7-5470-463e-a223-2c77d8e58cef" containerName="registry-server" containerID="cri-o://376fdc6061e9175f8f39287ad410f932fc989b04da063d68d552c10caf71ba78" gracePeriod=2 Mar 10 14:49:49 crc kubenswrapper[4911]: I0310 14:49:49.563080 4911 generic.go:334] "Generic (PLEG): container finished" podID="194679a7-5470-463e-a223-2c77d8e58cef" containerID="376fdc6061e9175f8f39287ad410f932fc989b04da063d68d552c10caf71ba78" exitCode=0 Mar 10 14:49:49 crc kubenswrapper[4911]: I0310 14:49:49.563205 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnjxx" 
event={"ID":"194679a7-5470-463e-a223-2c77d8e58cef","Type":"ContainerDied","Data":"376fdc6061e9175f8f39287ad410f932fc989b04da063d68d552c10caf71ba78"} Mar 10 14:49:49 crc kubenswrapper[4911]: I0310 14:49:49.953809 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.055183 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x82bf\" (UniqueName: \"kubernetes.io/projected/194679a7-5470-463e-a223-2c77d8e58cef-kube-api-access-x82bf\") pod \"194679a7-5470-463e-a223-2c77d8e58cef\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.055704 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-catalog-content\") pod \"194679a7-5470-463e-a223-2c77d8e58cef\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.055814 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-utilities\") pod \"194679a7-5470-463e-a223-2c77d8e58cef\" (UID: \"194679a7-5470-463e-a223-2c77d8e58cef\") " Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.057482 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-utilities" (OuterVolumeSpecName: "utilities") pod "194679a7-5470-463e-a223-2c77d8e58cef" (UID: "194679a7-5470-463e-a223-2c77d8e58cef"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.062185 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194679a7-5470-463e-a223-2c77d8e58cef-kube-api-access-x82bf" (OuterVolumeSpecName: "kube-api-access-x82bf") pod "194679a7-5470-463e-a223-2c77d8e58cef" (UID: "194679a7-5470-463e-a223-2c77d8e58cef"). InnerVolumeSpecName "kube-api-access-x82bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.125156 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "194679a7-5470-463e-a223-2c77d8e58cef" (UID: "194679a7-5470-463e-a223-2c77d8e58cef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.159855 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.159904 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194679a7-5470-463e-a223-2c77d8e58cef-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.159915 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x82bf\" (UniqueName: \"kubernetes.io/projected/194679a7-5470-463e-a223-2c77d8e58cef-kube-api-access-x82bf\") on node \"crc\" DevicePath \"\"" Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.579552 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnjxx" 
event={"ID":"194679a7-5470-463e-a223-2c77d8e58cef","Type":"ContainerDied","Data":"bfe5fdec1945702fde36fbd617cd38dc7ca551571ad83168eedab03f6648d9ac"} Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.579654 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hnjxx" Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.580053 4911 scope.go:117] "RemoveContainer" containerID="376fdc6061e9175f8f39287ad410f932fc989b04da063d68d552c10caf71ba78" Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.611078 4911 scope.go:117] "RemoveContainer" containerID="175b1245d2756d04069e4e28425b549c2e48b010fa68abbaba51723bcf5b2699" Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.619319 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hnjxx"] Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.633560 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hnjxx"] Mar 10 14:49:50 crc kubenswrapper[4911]: I0310 14:49:50.645247 4911 scope.go:117] "RemoveContainer" containerID="921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399" Mar 10 14:49:52 crc kubenswrapper[4911]: I0310 14:49:52.213785 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="194679a7-5470-463e-a223-2c77d8e58cef" path="/var/lib/kubelet/pods/194679a7-5470-463e-a223-2c77d8e58cef/volumes" Mar 10 14:49:53 crc kubenswrapper[4911]: I0310 14:49:53.285180 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 14:49:54 crc kubenswrapper[4911]: I0310 14:49:54.625772 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c6a78318-420c-43fe-98f3-9306e18ee2d4","Type":"ContainerStarted","Data":"dfa323fb63696c5eb6e33a9eb4b0973f727d796e2b864d596496753324281a3c"} Mar 10 14:49:54 crc 
kubenswrapper[4911]: I0310 14:49:54.656688 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.334857721 podStartE2EDuration="55.656613673s" podCreationTimestamp="2026-03-10 14:48:59 +0000 UTC" firstStartedPulling="2026-03-10 14:49:01.959942498 +0000 UTC m=+2846.523462415" lastFinishedPulling="2026-03-10 14:49:53.28169845 +0000 UTC m=+2897.845218367" observedRunningTime="2026-03-10 14:49:54.644456948 +0000 UTC m=+2899.207976865" watchObservedRunningTime="2026-03-10 14:49:54.656613673 +0000 UTC m=+2899.220133590" Mar 10 14:49:55 crc kubenswrapper[4911]: E0310 14:49:55.009413 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-conmon-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache]" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.155050 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552570-8jl6h"] Mar 10 14:50:00 crc kubenswrapper[4911]: E0310 14:50:00.156229 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194679a7-5470-463e-a223-2c77d8e58cef" containerName="extract-utilities" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.156251 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="194679a7-5470-463e-a223-2c77d8e58cef" containerName="extract-utilities" Mar 10 14:50:00 crc kubenswrapper[4911]: E0310 14:50:00.156306 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="194679a7-5470-463e-a223-2c77d8e58cef" containerName="extract-content" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.156313 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="194679a7-5470-463e-a223-2c77d8e58cef" containerName="extract-content" Mar 10 14:50:00 crc kubenswrapper[4911]: E0310 14:50:00.156338 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194679a7-5470-463e-a223-2c77d8e58cef" containerName="registry-server" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.156345 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="194679a7-5470-463e-a223-2c77d8e58cef" containerName="registry-server" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.156583 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="194679a7-5470-463e-a223-2c77d8e58cef" containerName="registry-server" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.157498 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552570-8jl6h" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.160567 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.160674 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.161396 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.170074 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552570-8jl6h"] Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.292259 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbwkr\" (UniqueName: 
\"kubernetes.io/projected/e7bb7062-5130-491c-a4a6-8286475eab11-kube-api-access-xbwkr\") pod \"auto-csr-approver-29552570-8jl6h\" (UID: \"e7bb7062-5130-491c-a4a6-8286475eab11\") " pod="openshift-infra/auto-csr-approver-29552570-8jl6h" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.394443 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbwkr\" (UniqueName: \"kubernetes.io/projected/e7bb7062-5130-491c-a4a6-8286475eab11-kube-api-access-xbwkr\") pod \"auto-csr-approver-29552570-8jl6h\" (UID: \"e7bb7062-5130-491c-a4a6-8286475eab11\") " pod="openshift-infra/auto-csr-approver-29552570-8jl6h" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.415269 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbwkr\" (UniqueName: \"kubernetes.io/projected/e7bb7062-5130-491c-a4a6-8286475eab11-kube-api-access-xbwkr\") pod \"auto-csr-approver-29552570-8jl6h\" (UID: \"e7bb7062-5130-491c-a4a6-8286475eab11\") " pod="openshift-infra/auto-csr-approver-29552570-8jl6h" Mar 10 14:50:00 crc kubenswrapper[4911]: I0310 14:50:00.486424 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552570-8jl6h" Mar 10 14:50:01 crc kubenswrapper[4911]: I0310 14:50:01.049194 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552570-8jl6h"] Mar 10 14:50:01 crc kubenswrapper[4911]: I0310 14:50:01.695411 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552570-8jl6h" event={"ID":"e7bb7062-5130-491c-a4a6-8286475eab11","Type":"ContainerStarted","Data":"b901a2481480ca96f63de530eb913a205af4fbc99df5895ebbc7f010983be6ef"} Mar 10 14:50:03 crc kubenswrapper[4911]: I0310 14:50:03.720658 4911 generic.go:334] "Generic (PLEG): container finished" podID="e7bb7062-5130-491c-a4a6-8286475eab11" containerID="0ac0c963fd9f924bb561eedb09fd0150036c29883719bdce4960212f4ec790b8" exitCode=0 Mar 10 14:50:03 crc kubenswrapper[4911]: I0310 14:50:03.720796 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552570-8jl6h" event={"ID":"e7bb7062-5130-491c-a4a6-8286475eab11","Type":"ContainerDied","Data":"0ac0c963fd9f924bb561eedb09fd0150036c29883719bdce4960212f4ec790b8"} Mar 10 14:50:05 crc kubenswrapper[4911]: I0310 14:50:05.122742 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552570-8jl6h" Mar 10 14:50:05 crc kubenswrapper[4911]: I0310 14:50:05.210571 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbwkr\" (UniqueName: \"kubernetes.io/projected/e7bb7062-5130-491c-a4a6-8286475eab11-kube-api-access-xbwkr\") pod \"e7bb7062-5130-491c-a4a6-8286475eab11\" (UID: \"e7bb7062-5130-491c-a4a6-8286475eab11\") " Mar 10 14:50:05 crc kubenswrapper[4911]: I0310 14:50:05.218454 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bb7062-5130-491c-a4a6-8286475eab11-kube-api-access-xbwkr" (OuterVolumeSpecName: "kube-api-access-xbwkr") pod "e7bb7062-5130-491c-a4a6-8286475eab11" (UID: "e7bb7062-5130-491c-a4a6-8286475eab11"). InnerVolumeSpecName "kube-api-access-xbwkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:50:05 crc kubenswrapper[4911]: E0310 14:50:05.283967 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-conmon-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache]" Mar 10 14:50:05 crc kubenswrapper[4911]: I0310 14:50:05.314808 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbwkr\" (UniqueName: \"kubernetes.io/projected/e7bb7062-5130-491c-a4a6-8286475eab11-kube-api-access-xbwkr\") on node \"crc\" DevicePath \"\"" Mar 10 14:50:05 crc kubenswrapper[4911]: I0310 14:50:05.746479 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552570-8jl6h" event={"ID":"e7bb7062-5130-491c-a4a6-8286475eab11","Type":"ContainerDied","Data":"b901a2481480ca96f63de530eb913a205af4fbc99df5895ebbc7f010983be6ef"} Mar 10 14:50:05 crc kubenswrapper[4911]: I0310 14:50:05.747003 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b901a2481480ca96f63de530eb913a205af4fbc99df5895ebbc7f010983be6ef" Mar 10 14:50:05 crc kubenswrapper[4911]: I0310 14:50:05.747103 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552570-8jl6h" Mar 10 14:50:06 crc kubenswrapper[4911]: I0310 14:50:06.226771 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552564-hqwcw"] Mar 10 14:50:06 crc kubenswrapper[4911]: I0310 14:50:06.238705 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552564-hqwcw"] Mar 10 14:50:08 crc kubenswrapper[4911]: I0310 14:50:08.208495 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63bcf83b-001c-4b6a-9aa1-df8fbd2f5491" path="/var/lib/kubelet/pods/63bcf83b-001c-4b6a-9aa1-df8fbd2f5491/volumes" Mar 10 14:50:15 crc kubenswrapper[4911]: E0310 14:50:15.545369 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-conmon-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache]" Mar 10 14:50:18 crc kubenswrapper[4911]: I0310 14:50:18.521329 4911 patch_prober.go:28] interesting 
pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:50:18 crc kubenswrapper[4911]: I0310 14:50:18.522111 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:50:25 crc kubenswrapper[4911]: E0310 14:50:25.826368 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-conmon-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache]" Mar 10 14:50:36 crc kubenswrapper[4911]: E0310 14:50:36.103299 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-conmon-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194679a7_5470_463e_a223_2c77d8e58cef.slice/crio-921c1ddc2fc03be4d8c6bfc4a9d7a55b4fea4fde8aa422748bd06450b0e83399.scope\": RecentStats: unable to find data in memory cache]" Mar 10 14:50:48 crc 
kubenswrapper[4911]: I0310 14:50:48.520506 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:50:48 crc kubenswrapper[4911]: I0310 14:50:48.523230 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:50:56 crc kubenswrapper[4911]: I0310 14:50:56.866183 4911 scope.go:117] "RemoveContainer" containerID="a2f2f014bcfa233463f20f997ea0c30c6ee07019fe84aea19b9e64445fb18fe3"
Mar 10 14:51:08 crc kubenswrapper[4911]: I0310 14:51:08.870442 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ppb77"]
Mar 10 14:51:08 crc kubenswrapper[4911]: E0310 14:51:08.871630 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bb7062-5130-491c-a4a6-8286475eab11" containerName="oc"
Mar 10 14:51:08 crc kubenswrapper[4911]: I0310 14:51:08.871646 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bb7062-5130-491c-a4a6-8286475eab11" containerName="oc"
Mar 10 14:51:08 crc kubenswrapper[4911]: I0310 14:51:08.871880 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7bb7062-5130-491c-a4a6-8286475eab11" containerName="oc"
Mar 10 14:51:08 crc kubenswrapper[4911]: I0310 14:51:08.873396 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:08 crc kubenswrapper[4911]: I0310 14:51:08.890478 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ppb77"]
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.043973 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-utilities\") pod \"certified-operators-ppb77\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") " pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.044042 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggwr4\" (UniqueName: \"kubernetes.io/projected/e8569eff-ea1d-427c-8e0b-e1e34984fce8-kube-api-access-ggwr4\") pod \"certified-operators-ppb77\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") " pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.044281 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-catalog-content\") pod \"certified-operators-ppb77\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") " pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.147533 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-utilities\") pod \"certified-operators-ppb77\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") " pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.147592 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggwr4\" (UniqueName: \"kubernetes.io/projected/e8569eff-ea1d-427c-8e0b-e1e34984fce8-kube-api-access-ggwr4\") pod \"certified-operators-ppb77\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") " pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.147657 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-catalog-content\") pod \"certified-operators-ppb77\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") " pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.148135 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-utilities\") pod \"certified-operators-ppb77\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") " pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.148172 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-catalog-content\") pod \"certified-operators-ppb77\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") " pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.172596 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggwr4\" (UniqueName: \"kubernetes.io/projected/e8569eff-ea1d-427c-8e0b-e1e34984fce8-kube-api-access-ggwr4\") pod \"certified-operators-ppb77\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") " pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.204516 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:09 crc kubenswrapper[4911]: I0310 14:51:09.720818 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ppb77"]
Mar 10 14:51:10 crc kubenswrapper[4911]: I0310 14:51:10.422708 4911 generic.go:334] "Generic (PLEG): container finished" podID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerID="82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f" exitCode=0
Mar 10 14:51:10 crc kubenswrapper[4911]: I0310 14:51:10.422896 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppb77" event={"ID":"e8569eff-ea1d-427c-8e0b-e1e34984fce8","Type":"ContainerDied","Data":"82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f"}
Mar 10 14:51:10 crc kubenswrapper[4911]: I0310 14:51:10.422972 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppb77" event={"ID":"e8569eff-ea1d-427c-8e0b-e1e34984fce8","Type":"ContainerStarted","Data":"d51935873450d0350791d6f6d43235760220b3cf1043b123ab0afca8c9d908bf"}
Mar 10 14:51:11 crc kubenswrapper[4911]: I0310 14:51:11.439643 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppb77" event={"ID":"e8569eff-ea1d-427c-8e0b-e1e34984fce8","Type":"ContainerStarted","Data":"4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6"}
Mar 10 14:51:12 crc kubenswrapper[4911]: I0310 14:51:12.452331 4911 generic.go:334] "Generic (PLEG): container finished" podID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerID="4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6" exitCode=0
Mar 10 14:51:12 crc kubenswrapper[4911]: I0310 14:51:12.452447 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppb77" event={"ID":"e8569eff-ea1d-427c-8e0b-e1e34984fce8","Type":"ContainerDied","Data":"4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6"}
Mar 10 14:51:13 crc kubenswrapper[4911]: I0310 14:51:13.469274 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppb77" event={"ID":"e8569eff-ea1d-427c-8e0b-e1e34984fce8","Type":"ContainerStarted","Data":"1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525"}
Mar 10 14:51:13 crc kubenswrapper[4911]: I0310 14:51:13.498585 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ppb77" podStartSLOduration=3.09559814 podStartE2EDuration="5.498553736s" podCreationTimestamp="2026-03-10 14:51:08 +0000 UTC" firstStartedPulling="2026-03-10 14:51:10.426434513 +0000 UTC m=+2974.989954430" lastFinishedPulling="2026-03-10 14:51:12.829390109 +0000 UTC m=+2977.392910026" observedRunningTime="2026-03-10 14:51:13.495983717 +0000 UTC m=+2978.059503634" watchObservedRunningTime="2026-03-10 14:51:13.498553736 +0000 UTC m=+2978.062073663"
Mar 10 14:51:18 crc kubenswrapper[4911]: I0310 14:51:18.520775 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:51:18 crc kubenswrapper[4911]: I0310 14:51:18.521311 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:51:18 crc kubenswrapper[4911]: I0310 14:51:18.521368 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx"
Mar 10 14:51:18 crc kubenswrapper[4911]: I0310 14:51:18.522376 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 14:51:18 crc kubenswrapper[4911]: I0310 14:51:18.522436 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" gracePeriod=600
Mar 10 14:51:18 crc kubenswrapper[4911]: E0310 14:51:18.658581 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:51:19 crc kubenswrapper[4911]: I0310 14:51:19.204772 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:19 crc kubenswrapper[4911]: I0310 14:51:19.204945 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:19 crc kubenswrapper[4911]: I0310 14:51:19.253760 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:19 crc kubenswrapper[4911]: I0310 14:51:19.548875 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" exitCode=0
Mar 10 14:51:19 crc kubenswrapper[4911]: I0310 14:51:19.548912 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"}
Mar 10 14:51:19 crc kubenswrapper[4911]: I0310 14:51:19.549135 4911 scope.go:117] "RemoveContainer" containerID="c92603dfaddc182b1a9ac490086f6b37810dc88d297b219df86ab89393789379"
Mar 10 14:51:19 crc kubenswrapper[4911]: I0310 14:51:19.550623 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:51:19 crc kubenswrapper[4911]: E0310 14:51:19.551291 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:51:19 crc kubenswrapper[4911]: I0310 14:51:19.631211 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:20 crc kubenswrapper[4911]: I0310 14:51:20.256886 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ppb77"]
Mar 10 14:51:21 crc kubenswrapper[4911]: I0310 14:51:21.571970 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ppb77" podUID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerName="registry-server" containerID="cri-o://1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525" gracePeriod=2
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.079346 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.269095 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-utilities\") pod \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") "
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.269325 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-catalog-content\") pod \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") "
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.269368 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggwr4\" (UniqueName: \"kubernetes.io/projected/e8569eff-ea1d-427c-8e0b-e1e34984fce8-kube-api-access-ggwr4\") pod \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\" (UID: \"e8569eff-ea1d-427c-8e0b-e1e34984fce8\") "
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.270589 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-utilities" (OuterVolumeSpecName: "utilities") pod "e8569eff-ea1d-427c-8e0b-e1e34984fce8" (UID: "e8569eff-ea1d-427c-8e0b-e1e34984fce8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.276665 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8569eff-ea1d-427c-8e0b-e1e34984fce8-kube-api-access-ggwr4" (OuterVolumeSpecName: "kube-api-access-ggwr4") pod "e8569eff-ea1d-427c-8e0b-e1e34984fce8" (UID: "e8569eff-ea1d-427c-8e0b-e1e34984fce8"). InnerVolumeSpecName "kube-api-access-ggwr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.373001 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggwr4\" (UniqueName: \"kubernetes.io/projected/e8569eff-ea1d-427c-8e0b-e1e34984fce8-kube-api-access-ggwr4\") on node \"crc\" DevicePath \"\""
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.373056 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.589085 4911 generic.go:334] "Generic (PLEG): container finished" podID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerID="1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525" exitCode=0
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.589146 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppb77" event={"ID":"e8569eff-ea1d-427c-8e0b-e1e34984fce8","Type":"ContainerDied","Data":"1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525"}
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.589197 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppb77" event={"ID":"e8569eff-ea1d-427c-8e0b-e1e34984fce8","Type":"ContainerDied","Data":"d51935873450d0350791d6f6d43235760220b3cf1043b123ab0afca8c9d908bf"}
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.589219 4911 scope.go:117] "RemoveContainer" containerID="1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.589184 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ppb77"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.638025 4911 scope.go:117] "RemoveContainer" containerID="4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.678984 4911 scope.go:117] "RemoveContainer" containerID="82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.728058 4911 scope.go:117] "RemoveContainer" containerID="1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525"
Mar 10 14:51:22 crc kubenswrapper[4911]: E0310 14:51:22.728876 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525\": container with ID starting with 1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525 not found: ID does not exist" containerID="1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.728942 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525"} err="failed to get container status \"1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525\": rpc error: code = NotFound desc = could not find container \"1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525\": container with ID starting with 1a0274f844f0a2b8b1f350f75278865face99322d9d6cad0f6de63444c3bc525 not found: ID does not exist"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.728976 4911 scope.go:117] "RemoveContainer" containerID="4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6"
Mar 10 14:51:22 crc kubenswrapper[4911]: E0310 14:51:22.729388 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6\": container with ID starting with 4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6 not found: ID does not exist" containerID="4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.729436 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6"} err="failed to get container status \"4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6\": rpc error: code = NotFound desc = could not find container \"4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6\": container with ID starting with 4a229300cca8aabefd9edcf8c917abbbf6f2bf3663e80dd676f1831e375a4fb6 not found: ID does not exist"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.729475 4911 scope.go:117] "RemoveContainer" containerID="82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f"
Mar 10 14:51:22 crc kubenswrapper[4911]: E0310 14:51:22.730108 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f\": container with ID starting with 82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f not found: ID does not exist" containerID="82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.730147 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f"} err="failed to get container status \"82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f\": rpc error: code = NotFound desc = could not find container \"82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f\": container with ID starting with 82ee51ff91b3a85768351a470544a7893c6017dfdb114d1fc9cbd30f0442ec1f not found: ID does not exist"
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.876089 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8569eff-ea1d-427c-8e0b-e1e34984fce8" (UID: "e8569eff-ea1d-427c-8e0b-e1e34984fce8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.885631 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8569eff-ea1d-427c-8e0b-e1e34984fce8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.930981 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ppb77"]
Mar 10 14:51:22 crc kubenswrapper[4911]: I0310 14:51:22.939785 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ppb77"]
Mar 10 14:51:24 crc kubenswrapper[4911]: I0310 14:51:24.206594 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" path="/var/lib/kubelet/pods/e8569eff-ea1d-427c-8e0b-e1e34984fce8/volumes"
Mar 10 14:51:33 crc kubenswrapper[4911]: I0310 14:51:33.193503 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:51:33 crc kubenswrapper[4911]: E0310 14:51:33.194334 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:51:47 crc kubenswrapper[4911]: I0310 14:51:47.194501 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:51:47 crc kubenswrapper[4911]: E0310 14:51:47.195973 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.152001 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552572-r952k"]
Mar 10 14:52:00 crc kubenswrapper[4911]: E0310 14:52:00.153714 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerName="registry-server"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.153759 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerName="registry-server"
Mar 10 14:52:00 crc kubenswrapper[4911]: E0310 14:52:00.153790 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerName="extract-content"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.153799 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerName="extract-content"
Mar 10 14:52:00 crc kubenswrapper[4911]: E0310 14:52:00.153823 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerName="extract-utilities"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.153839 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerName="extract-utilities"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.154120 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8569eff-ea1d-427c-8e0b-e1e34984fce8" containerName="registry-server"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.155214 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552572-r952k"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.159272 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.162464 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552572-r952k"]
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.162866 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.164268 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.258445 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkg66\" (UniqueName: \"kubernetes.io/projected/5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe-kube-api-access-pkg66\") pod \"auto-csr-approver-29552572-r952k\" (UID: \"5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe\") " pod="openshift-infra/auto-csr-approver-29552572-r952k"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.361050 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkg66\" (UniqueName: \"kubernetes.io/projected/5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe-kube-api-access-pkg66\") pod \"auto-csr-approver-29552572-r952k\" (UID: \"5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe\") " pod="openshift-infra/auto-csr-approver-29552572-r952k"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.382564 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkg66\" (UniqueName: \"kubernetes.io/projected/5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe-kube-api-access-pkg66\") pod \"auto-csr-approver-29552572-r952k\" (UID: \"5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe\") " pod="openshift-infra/auto-csr-approver-29552572-r952k"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.476385 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552572-r952k"
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.974158 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552572-r952k"]
Mar 10 14:52:00 crc kubenswrapper[4911]: I0310 14:52:00.999446 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552572-r952k" event={"ID":"5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe","Type":"ContainerStarted","Data":"d8ace094d2894903923c2a6e4acbebc6feaf3356c9480bd0e114c0f1d7e0b509"}
Mar 10 14:52:01 crc kubenswrapper[4911]: I0310 14:52:01.194389 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:52:01 crc kubenswrapper[4911]: E0310 14:52:01.197212 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:52:03 crc kubenswrapper[4911]: I0310 14:52:03.023995 4911 generic.go:334] "Generic (PLEG): container finished" podID="5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe" containerID="dd7d2c6531c590bb40a27ef0f1c5ff5ee2086c3a826a5605686ae5f26c74825e" exitCode=0
Mar 10 14:52:03 crc kubenswrapper[4911]: I0310 14:52:03.024491 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552572-r952k" event={"ID":"5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe","Type":"ContainerDied","Data":"dd7d2c6531c590bb40a27ef0f1c5ff5ee2086c3a826a5605686ae5f26c74825e"}
Mar 10 14:52:04 crc kubenswrapper[4911]: I0310 14:52:04.502822 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552572-r952k"
Mar 10 14:52:04 crc kubenswrapper[4911]: I0310 14:52:04.570817 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkg66\" (UniqueName: \"kubernetes.io/projected/5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe-kube-api-access-pkg66\") pod \"5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe\" (UID: \"5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe\") "
Mar 10 14:52:04 crc kubenswrapper[4911]: I0310 14:52:04.579243 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe-kube-api-access-pkg66" (OuterVolumeSpecName: "kube-api-access-pkg66") pod "5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe" (UID: "5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe"). InnerVolumeSpecName "kube-api-access-pkg66". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:52:04 crc kubenswrapper[4911]: I0310 14:52:04.673973 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkg66\" (UniqueName: \"kubernetes.io/projected/5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe-kube-api-access-pkg66\") on node \"crc\" DevicePath \"\""
Mar 10 14:52:05 crc kubenswrapper[4911]: I0310 14:52:05.053143 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552572-r952k" event={"ID":"5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe","Type":"ContainerDied","Data":"d8ace094d2894903923c2a6e4acbebc6feaf3356c9480bd0e114c0f1d7e0b509"}
Mar 10 14:52:05 crc kubenswrapper[4911]: I0310 14:52:05.053211 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8ace094d2894903923c2a6e4acbebc6feaf3356c9480bd0e114c0f1d7e0b509"
Mar 10 14:52:05 crc kubenswrapper[4911]: I0310 14:52:05.053247 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552572-r952k"
Mar 10 14:52:05 crc kubenswrapper[4911]: I0310 14:52:05.597318 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552566-c2rz2"]
Mar 10 14:52:05 crc kubenswrapper[4911]: I0310 14:52:05.608005 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552566-c2rz2"]
Mar 10 14:52:06 crc kubenswrapper[4911]: I0310 14:52:06.216410 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a0ac3c-c2e3-41d9-9da5-24745865c184" path="/var/lib/kubelet/pods/97a0ac3c-c2e3-41d9-9da5-24745865c184/volumes"
Mar 10 14:52:16 crc kubenswrapper[4911]: I0310 14:52:16.204480 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:52:16 crc kubenswrapper[4911]: E0310 14:52:16.205461 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:52:27 crc kubenswrapper[4911]: I0310 14:52:27.193161 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:52:27 crc kubenswrapper[4911]: E0310 14:52:27.193838 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:52:40 crc kubenswrapper[4911]: I0310 14:52:40.194342 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:52:40 crc kubenswrapper[4911]: E0310 14:52:40.195335 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:52:52 crc kubenswrapper[4911]: I0310 14:52:52.193533 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:52:52 crc kubenswrapper[4911]: E0310 14:52:52.194448 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:52:57 crc kubenswrapper[4911]: I0310 14:52:57.051662 4911 scope.go:117] "RemoveContainer" containerID="77425a8168accf4bb764468f4730f411992b0679c72425ac86375a77239f9719"
Mar 10 14:53:04 crc kubenswrapper[4911]: I0310 14:53:04.193472 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:53:04 crc kubenswrapper[4911]: E0310 14:53:04.194259 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:53:17 crc kubenswrapper[4911]: I0310 14:53:17.194148 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:53:17 crc kubenswrapper[4911]: E0310 14:53:17.195058 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:53:30 crc kubenswrapper[4911]: I0310 14:53:30.193556 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:53:30 crc kubenswrapper[4911]: E0310 14:53:30.194566 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:53:43 crc kubenswrapper[4911]: I0310 14:53:43.193463 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:53:43 crc kubenswrapper[4911]: E0310 14:53:43.194381 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:53:55 crc kubenswrapper[4911]: I0310 14:53:55.193600 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b"
Mar 10 14:53:55 crc kubenswrapper[4911]: E0310 14:53:55.194366 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.154872 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552574-j4t7l"]
Mar 10 14:54:00 crc kubenswrapper[4911]: E0310 14:54:00.156572 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe" containerName="oc"
Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.156594 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe" containerName="oc"
Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.156864 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe" containerName="oc"
Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.158018 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552574-j4t7l"
Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.161301 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.161744 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.162843 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsvm6\" (UniqueName: \"kubernetes.io/projected/998046ee-be1e-4a06-a0f5-91f9c808b53f-kube-api-access-nsvm6\") pod \"auto-csr-approver-29552574-j4t7l\" (UID: \"998046ee-be1e-4a06-a0f5-91f9c808b53f\") " pod="openshift-infra/auto-csr-approver-29552574-j4t7l"
Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.163829 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.165667 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openshift-infra/auto-csr-approver-29552574-j4t7l"] Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.264563 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsvm6\" (UniqueName: \"kubernetes.io/projected/998046ee-be1e-4a06-a0f5-91f9c808b53f-kube-api-access-nsvm6\") pod \"auto-csr-approver-29552574-j4t7l\" (UID: \"998046ee-be1e-4a06-a0f5-91f9c808b53f\") " pod="openshift-infra/auto-csr-approver-29552574-j4t7l" Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.288760 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsvm6\" (UniqueName: \"kubernetes.io/projected/998046ee-be1e-4a06-a0f5-91f9c808b53f-kube-api-access-nsvm6\") pod \"auto-csr-approver-29552574-j4t7l\" (UID: \"998046ee-be1e-4a06-a0f5-91f9c808b53f\") " pod="openshift-infra/auto-csr-approver-29552574-j4t7l" Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.483805 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552574-j4t7l" Mar 10 14:54:00 crc kubenswrapper[4911]: I0310 14:54:00.980095 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552574-j4t7l"] Mar 10 14:54:01 crc kubenswrapper[4911]: I0310 14:54:01.272968 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552574-j4t7l" event={"ID":"998046ee-be1e-4a06-a0f5-91f9c808b53f","Type":"ContainerStarted","Data":"64740f3df9ed1438b34eabf5a43cf0389cb3df29e84503590ce3aab681164e22"} Mar 10 14:54:03 crc kubenswrapper[4911]: I0310 14:54:03.298496 4911 generic.go:334] "Generic (PLEG): container finished" podID="998046ee-be1e-4a06-a0f5-91f9c808b53f" containerID="79c5f9eb6d80cff3d980466a6c90591a2a0f50f81bd35c2f259eb4d072d6c8a2" exitCode=0 Mar 10 14:54:03 crc kubenswrapper[4911]: I0310 14:54:03.298607 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552574-j4t7l" event={"ID":"998046ee-be1e-4a06-a0f5-91f9c808b53f","Type":"ContainerDied","Data":"79c5f9eb6d80cff3d980466a6c90591a2a0f50f81bd35c2f259eb4d072d6c8a2"} Mar 10 14:54:04 crc kubenswrapper[4911]: I0310 14:54:04.731952 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552574-j4t7l" Mar 10 14:54:04 crc kubenswrapper[4911]: I0310 14:54:04.857806 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsvm6\" (UniqueName: \"kubernetes.io/projected/998046ee-be1e-4a06-a0f5-91f9c808b53f-kube-api-access-nsvm6\") pod \"998046ee-be1e-4a06-a0f5-91f9c808b53f\" (UID: \"998046ee-be1e-4a06-a0f5-91f9c808b53f\") " Mar 10 14:54:04 crc kubenswrapper[4911]: I0310 14:54:04.863259 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/998046ee-be1e-4a06-a0f5-91f9c808b53f-kube-api-access-nsvm6" (OuterVolumeSpecName: "kube-api-access-nsvm6") pod "998046ee-be1e-4a06-a0f5-91f9c808b53f" (UID: "998046ee-be1e-4a06-a0f5-91f9c808b53f"). InnerVolumeSpecName "kube-api-access-nsvm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:54:04 crc kubenswrapper[4911]: I0310 14:54:04.960613 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsvm6\" (UniqueName: \"kubernetes.io/projected/998046ee-be1e-4a06-a0f5-91f9c808b53f-kube-api-access-nsvm6\") on node \"crc\" DevicePath \"\"" Mar 10 14:54:05 crc kubenswrapper[4911]: I0310 14:54:05.323974 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552574-j4t7l" event={"ID":"998046ee-be1e-4a06-a0f5-91f9c808b53f","Type":"ContainerDied","Data":"64740f3df9ed1438b34eabf5a43cf0389cb3df29e84503590ce3aab681164e22"} Mar 10 14:54:05 crc kubenswrapper[4911]: I0310 14:54:05.324029 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64740f3df9ed1438b34eabf5a43cf0389cb3df29e84503590ce3aab681164e22" Mar 10 14:54:05 crc kubenswrapper[4911]: I0310 14:54:05.324114 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552574-j4t7l" Mar 10 14:54:05 crc kubenswrapper[4911]: I0310 14:54:05.820003 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552568-z8bl8"] Mar 10 14:54:05 crc kubenswrapper[4911]: I0310 14:54:05.831238 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552568-z8bl8"] Mar 10 14:54:06 crc kubenswrapper[4911]: I0310 14:54:06.208716 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e937b10-d5be-4e52-a7dc-fc92ab37a169" path="/var/lib/kubelet/pods/1e937b10-d5be-4e52-a7dc-fc92ab37a169/volumes" Mar 10 14:54:08 crc kubenswrapper[4911]: I0310 14:54:08.194992 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:54:08 crc kubenswrapper[4911]: E0310 14:54:08.195557 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:54:21 crc kubenswrapper[4911]: I0310 14:54:21.194482 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:54:21 crc kubenswrapper[4911]: E0310 14:54:21.195760 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:54:33 crc kubenswrapper[4911]: I0310 14:54:33.194436 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:54:33 crc kubenswrapper[4911]: E0310 14:54:33.195187 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:54:48 crc kubenswrapper[4911]: I0310 14:54:48.194353 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:54:48 crc kubenswrapper[4911]: E0310 14:54:48.195507 4911 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:54:57 crc kubenswrapper[4911]: I0310 14:54:57.161238 4911 scope.go:117] "RemoveContainer" containerID="b7215ae128e520b3805668a323673602781c86132e79c5ab649de8e3f2b53377" Mar 10 14:54:59 crc kubenswrapper[4911]: I0310 14:54:59.194609 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:54:59 crc kubenswrapper[4911]: E0310 14:54:59.196619 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:55:14 crc kubenswrapper[4911]: I0310 14:55:14.193607 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:55:14 crc kubenswrapper[4911]: E0310 14:55:14.194383 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:55:26 crc kubenswrapper[4911]: I0310 14:55:26.198775 4911 scope.go:117] 
"RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:55:26 crc kubenswrapper[4911]: E0310 14:55:26.199644 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:55:39 crc kubenswrapper[4911]: I0310 14:55:39.194183 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:55:39 crc kubenswrapper[4911]: E0310 14:55:39.195004 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:55:50 crc kubenswrapper[4911]: I0310 14:55:50.194710 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:55:50 crc kubenswrapper[4911]: E0310 14:55:50.196227 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.149506 
4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552576-9kx6p"] Mar 10 14:56:00 crc kubenswrapper[4911]: E0310 14:56:00.150610 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998046ee-be1e-4a06-a0f5-91f9c808b53f" containerName="oc" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.150630 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="998046ee-be1e-4a06-a0f5-91f9c808b53f" containerName="oc" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.150880 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="998046ee-be1e-4a06-a0f5-91f9c808b53f" containerName="oc" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.151814 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552576-9kx6p" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.154707 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.155011 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.155154 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.167498 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552576-9kx6p"] Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.308055 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2fzg\" (UniqueName: \"kubernetes.io/projected/47b1a3fe-2219-4c28-9b35-f1e0a9e4977c-kube-api-access-h2fzg\") pod \"auto-csr-approver-29552576-9kx6p\" (UID: \"47b1a3fe-2219-4c28-9b35-f1e0a9e4977c\") " 
pod="openshift-infra/auto-csr-approver-29552576-9kx6p" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.411065 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2fzg\" (UniqueName: \"kubernetes.io/projected/47b1a3fe-2219-4c28-9b35-f1e0a9e4977c-kube-api-access-h2fzg\") pod \"auto-csr-approver-29552576-9kx6p\" (UID: \"47b1a3fe-2219-4c28-9b35-f1e0a9e4977c\") " pod="openshift-infra/auto-csr-approver-29552576-9kx6p" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.431899 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2fzg\" (UniqueName: \"kubernetes.io/projected/47b1a3fe-2219-4c28-9b35-f1e0a9e4977c-kube-api-access-h2fzg\") pod \"auto-csr-approver-29552576-9kx6p\" (UID: \"47b1a3fe-2219-4c28-9b35-f1e0a9e4977c\") " pod="openshift-infra/auto-csr-approver-29552576-9kx6p" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.475850 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552576-9kx6p" Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.959952 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 14:56:00 crc kubenswrapper[4911]: I0310 14:56:00.967417 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552576-9kx6p"] Mar 10 14:56:01 crc kubenswrapper[4911]: I0310 14:56:01.948022 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552576-9kx6p" event={"ID":"47b1a3fe-2219-4c28-9b35-f1e0a9e4977c","Type":"ContainerStarted","Data":"7e6ceabdfc9d603c7a4b138d7782e614e2dcb124aa00777443181e2d37980cfa"} Mar 10 14:56:02 crc kubenswrapper[4911]: I0310 14:56:02.958766 4911 generic.go:334] "Generic (PLEG): container finished" podID="47b1a3fe-2219-4c28-9b35-f1e0a9e4977c" containerID="eddcb6b3f5cd7a6611d592b1865b61840f6dcebbca6b3b5ffa641285ac14b120" exitCode=0 Mar 
10 14:56:02 crc kubenswrapper[4911]: I0310 14:56:02.958868 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552576-9kx6p" event={"ID":"47b1a3fe-2219-4c28-9b35-f1e0a9e4977c","Type":"ContainerDied","Data":"eddcb6b3f5cd7a6611d592b1865b61840f6dcebbca6b3b5ffa641285ac14b120"} Mar 10 14:56:03 crc kubenswrapper[4911]: I0310 14:56:03.194109 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:56:03 crc kubenswrapper[4911]: E0310 14:56:03.194361 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:56:04 crc kubenswrapper[4911]: I0310 14:56:04.425248 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552576-9kx6p" Mar 10 14:56:04 crc kubenswrapper[4911]: I0310 14:56:04.608760 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2fzg\" (UniqueName: \"kubernetes.io/projected/47b1a3fe-2219-4c28-9b35-f1e0a9e4977c-kube-api-access-h2fzg\") pod \"47b1a3fe-2219-4c28-9b35-f1e0a9e4977c\" (UID: \"47b1a3fe-2219-4c28-9b35-f1e0a9e4977c\") " Mar 10 14:56:04 crc kubenswrapper[4911]: I0310 14:56:04.614137 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b1a3fe-2219-4c28-9b35-f1e0a9e4977c-kube-api-access-h2fzg" (OuterVolumeSpecName: "kube-api-access-h2fzg") pod "47b1a3fe-2219-4c28-9b35-f1e0a9e4977c" (UID: "47b1a3fe-2219-4c28-9b35-f1e0a9e4977c"). InnerVolumeSpecName "kube-api-access-h2fzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:56:04 crc kubenswrapper[4911]: I0310 14:56:04.711654 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2fzg\" (UniqueName: \"kubernetes.io/projected/47b1a3fe-2219-4c28-9b35-f1e0a9e4977c-kube-api-access-h2fzg\") on node \"crc\" DevicePath \"\"" Mar 10 14:56:04 crc kubenswrapper[4911]: I0310 14:56:04.978235 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552576-9kx6p" event={"ID":"47b1a3fe-2219-4c28-9b35-f1e0a9e4977c","Type":"ContainerDied","Data":"7e6ceabdfc9d603c7a4b138d7782e614e2dcb124aa00777443181e2d37980cfa"} Mar 10 14:56:04 crc kubenswrapper[4911]: I0310 14:56:04.978284 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e6ceabdfc9d603c7a4b138d7782e614e2dcb124aa00777443181e2d37980cfa" Mar 10 14:56:04 crc kubenswrapper[4911]: I0310 14:56:04.978303 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552576-9kx6p" Mar 10 14:56:05 crc kubenswrapper[4911]: I0310 14:56:05.502205 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552570-8jl6h"] Mar 10 14:56:05 crc kubenswrapper[4911]: I0310 14:56:05.512275 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552570-8jl6h"] Mar 10 14:56:06 crc kubenswrapper[4911]: I0310 14:56:06.210899 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7bb7062-5130-491c-a4a6-8286475eab11" path="/var/lib/kubelet/pods/e7bb7062-5130-491c-a4a6-8286475eab11/volumes" Mar 10 14:56:14 crc kubenswrapper[4911]: I0310 14:56:14.194880 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:56:14 crc kubenswrapper[4911]: E0310 14:56:14.196073 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 14:56:29 crc kubenswrapper[4911]: I0310 14:56:29.195024 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:56:30 crc kubenswrapper[4911]: I0310 14:56:30.225282 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"19ad5193c0bc4ed506cb6042cc3e642a139d1b08dd49628d1a8a9e56568fa86f"} Mar 10 14:56:57 crc kubenswrapper[4911]: I0310 14:56:57.254129 4911 scope.go:117] "RemoveContainer" containerID="0ac0c963fd9f924bb561eedb09fd0150036c29883719bdce4960212f4ec790b8" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.451489 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bs5vx"] Mar 10 14:57:05 crc kubenswrapper[4911]: E0310 14:57:05.452942 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b1a3fe-2219-4c28-9b35-f1e0a9e4977c" containerName="oc" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.452976 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b1a3fe-2219-4c28-9b35-f1e0a9e4977c" containerName="oc" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.453305 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b1a3fe-2219-4c28-9b35-f1e0a9e4977c" containerName="oc" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.455536 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bs5vx" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.468422 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs5vx"] Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.552553 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-utilities\") pod \"redhat-operators-bs5vx\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") " pod="openshift-marketplace/redhat-operators-bs5vx" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.552635 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-catalog-content\") pod \"redhat-operators-bs5vx\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") " pod="openshift-marketplace/redhat-operators-bs5vx" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.552661 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n584f\" (UniqueName: \"kubernetes.io/projected/41a860a8-7190-4879-b159-7aaf49a4a9d8-kube-api-access-n584f\") pod \"redhat-operators-bs5vx\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") " pod="openshift-marketplace/redhat-operators-bs5vx" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.655003 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n584f\" (UniqueName: \"kubernetes.io/projected/41a860a8-7190-4879-b159-7aaf49a4a9d8-kube-api-access-n584f\") pod \"redhat-operators-bs5vx\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") " pod="openshift-marketplace/redhat-operators-bs5vx" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.655191 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-utilities\") pod \"redhat-operators-bs5vx\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") " pod="openshift-marketplace/redhat-operators-bs5vx" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.655239 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-catalog-content\") pod \"redhat-operators-bs5vx\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") " pod="openshift-marketplace/redhat-operators-bs5vx" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.655894 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-utilities\") pod \"redhat-operators-bs5vx\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") " pod="openshift-marketplace/redhat-operators-bs5vx" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.656232 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-catalog-content\") pod \"redhat-operators-bs5vx\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") " pod="openshift-marketplace/redhat-operators-bs5vx" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.677302 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n584f\" (UniqueName: \"kubernetes.io/projected/41a860a8-7190-4879-b159-7aaf49a4a9d8-kube-api-access-n584f\") pod \"redhat-operators-bs5vx\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") " pod="openshift-marketplace/redhat-operators-bs5vx" Mar 10 14:57:05 crc kubenswrapper[4911]: I0310 14:57:05.780912 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bs5vx"
Mar 10 14:57:06 crc kubenswrapper[4911]: I0310 14:57:06.292193 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs5vx"]
Mar 10 14:57:06 crc kubenswrapper[4911]: I0310 14:57:06.625783 4911 generic.go:334] "Generic (PLEG): container finished" podID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerID="e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17" exitCode=0
Mar 10 14:57:06 crc kubenswrapper[4911]: I0310 14:57:06.625882 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5vx" event={"ID":"41a860a8-7190-4879-b159-7aaf49a4a9d8","Type":"ContainerDied","Data":"e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17"}
Mar 10 14:57:06 crc kubenswrapper[4911]: I0310 14:57:06.626153 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5vx" event={"ID":"41a860a8-7190-4879-b159-7aaf49a4a9d8","Type":"ContainerStarted","Data":"c5a3010558e4eeb61df03bce1cddb6f7d9188604a9506747e309dc863ae56ac3"}
Mar 10 14:57:08 crc kubenswrapper[4911]: I0310 14:57:08.653752 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5vx" event={"ID":"41a860a8-7190-4879-b159-7aaf49a4a9d8","Type":"ContainerStarted","Data":"1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a"}
Mar 10 14:57:10 crc kubenswrapper[4911]: I0310 14:57:10.676311 4911 generic.go:334] "Generic (PLEG): container finished" podID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerID="1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a" exitCode=0
Mar 10 14:57:10 crc kubenswrapper[4911]: I0310 14:57:10.676412 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5vx" event={"ID":"41a860a8-7190-4879-b159-7aaf49a4a9d8","Type":"ContainerDied","Data":"1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a"}
Mar 10 14:57:11 crc kubenswrapper[4911]: I0310 14:57:11.689932 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5vx" event={"ID":"41a860a8-7190-4879-b159-7aaf49a4a9d8","Type":"ContainerStarted","Data":"7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7"}
Mar 10 14:57:11 crc kubenswrapper[4911]: I0310 14:57:11.717784 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bs5vx" podStartSLOduration=2.261006278 podStartE2EDuration="6.717739721s" podCreationTimestamp="2026-03-10 14:57:05 +0000 UTC" firstStartedPulling="2026-03-10 14:57:06.627843324 +0000 UTC m=+3331.191363241" lastFinishedPulling="2026-03-10 14:57:11.084576767 +0000 UTC m=+3335.648096684" observedRunningTime="2026-03-10 14:57:11.711040671 +0000 UTC m=+3336.274560588" watchObservedRunningTime="2026-03-10 14:57:11.717739721 +0000 UTC m=+3336.281259638"
Mar 10 14:57:15 crc kubenswrapper[4911]: I0310 14:57:15.781283 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bs5vx"
Mar 10 14:57:15 crc kubenswrapper[4911]: I0310 14:57:15.781971 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bs5vx"
Mar 10 14:57:16 crc kubenswrapper[4911]: I0310 14:57:16.828527 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs5vx" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerName="registry-server" probeResult="failure" output=<
Mar 10 14:57:16 crc kubenswrapper[4911]: 	timeout: failed to connect service ":50051" within 1s
Mar 10 14:57:16 crc kubenswrapper[4911]:  >
Mar 10 14:57:26 crc kubenswrapper[4911]: I0310 14:57:26.834768 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs5vx" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerName="registry-server" probeResult="failure" output=<
Mar 10 14:57:26 crc kubenswrapper[4911]: 	timeout: failed to connect service ":50051" within 1s
Mar 10 14:57:26 crc kubenswrapper[4911]:  >
Mar 10 14:57:35 crc kubenswrapper[4911]: I0310 14:57:35.833024 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bs5vx"
Mar 10 14:57:35 crc kubenswrapper[4911]: I0310 14:57:35.916810 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bs5vx"
Mar 10 14:57:36 crc kubenswrapper[4911]: I0310 14:57:36.651525 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bs5vx"]
Mar 10 14:57:36 crc kubenswrapper[4911]: I0310 14:57:36.938456 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bs5vx" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerName="registry-server" containerID="cri-o://7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7" gracePeriod=2
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.497611 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs5vx"
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.546455 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-utilities\") pod \"41a860a8-7190-4879-b159-7aaf49a4a9d8\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") "
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.546887 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-catalog-content\") pod \"41a860a8-7190-4879-b159-7aaf49a4a9d8\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") "
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.546997 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n584f\" (UniqueName: \"kubernetes.io/projected/41a860a8-7190-4879-b159-7aaf49a4a9d8-kube-api-access-n584f\") pod \"41a860a8-7190-4879-b159-7aaf49a4a9d8\" (UID: \"41a860a8-7190-4879-b159-7aaf49a4a9d8\") "
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.547252 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-utilities" (OuterVolumeSpecName: "utilities") pod "41a860a8-7190-4879-b159-7aaf49a4a9d8" (UID: "41a860a8-7190-4879-b159-7aaf49a4a9d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.548384 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.555599 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a860a8-7190-4879-b159-7aaf49a4a9d8-kube-api-access-n584f" (OuterVolumeSpecName: "kube-api-access-n584f") pod "41a860a8-7190-4879-b159-7aaf49a4a9d8" (UID: "41a860a8-7190-4879-b159-7aaf49a4a9d8"). InnerVolumeSpecName "kube-api-access-n584f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.650982 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n584f\" (UniqueName: \"kubernetes.io/projected/41a860a8-7190-4879-b159-7aaf49a4a9d8-kube-api-access-n584f\") on node \"crc\" DevicePath \"\""
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.713960 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41a860a8-7190-4879-b159-7aaf49a4a9d8" (UID: "41a860a8-7190-4879-b159-7aaf49a4a9d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.752974 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a860a8-7190-4879-b159-7aaf49a4a9d8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.955363 4911 generic.go:334] "Generic (PLEG): container finished" podID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerID="7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7" exitCode=0
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.955428 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5vx" event={"ID":"41a860a8-7190-4879-b159-7aaf49a4a9d8","Type":"ContainerDied","Data":"7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7"}
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.955473 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5vx" event={"ID":"41a860a8-7190-4879-b159-7aaf49a4a9d8","Type":"ContainerDied","Data":"c5a3010558e4eeb61df03bce1cddb6f7d9188604a9506747e309dc863ae56ac3"}
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.955499 4911 scope.go:117] "RemoveContainer" containerID="7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7"
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.955519 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs5vx"
Mar 10 14:57:37 crc kubenswrapper[4911]: I0310 14:57:37.984820 4911 scope.go:117] "RemoveContainer" containerID="1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a"
Mar 10 14:57:38 crc kubenswrapper[4911]: I0310 14:57:38.000427 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bs5vx"]
Mar 10 14:57:38 crc kubenswrapper[4911]: I0310 14:57:38.010434 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bs5vx"]
Mar 10 14:57:38 crc kubenswrapper[4911]: I0310 14:57:38.035401 4911 scope.go:117] "RemoveContainer" containerID="e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17"
Mar 10 14:57:38 crc kubenswrapper[4911]: I0310 14:57:38.082391 4911 scope.go:117] "RemoveContainer" containerID="7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7"
Mar 10 14:57:38 crc kubenswrapper[4911]: E0310 14:57:38.083062 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7\": container with ID starting with 7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7 not found: ID does not exist" containerID="7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7"
Mar 10 14:57:38 crc kubenswrapper[4911]: I0310 14:57:38.083113 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7"} err="failed to get container status \"7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7\": rpc error: code = NotFound desc = could not find container \"7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7\": container with ID starting with 7a41b850a439ee35ca6ba30992e4c5c3515f7b1cae43faf3580287b30b9c75c7 not found: ID does not exist"
Mar 10 14:57:38 crc kubenswrapper[4911]: I0310 14:57:38.083150 4911 scope.go:117] "RemoveContainer" containerID="1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a"
Mar 10 14:57:38 crc kubenswrapper[4911]: E0310 14:57:38.083695 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a\": container with ID starting with 1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a not found: ID does not exist" containerID="1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a"
Mar 10 14:57:38 crc kubenswrapper[4911]: I0310 14:57:38.083771 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a"} err="failed to get container status \"1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a\": rpc error: code = NotFound desc = could not find container \"1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a\": container with ID starting with 1e16d6a6011099c3f980f44106449d6ea5760383c8378635aa76609ad000015a not found: ID does not exist"
Mar 10 14:57:38 crc kubenswrapper[4911]: I0310 14:57:38.083817 4911 scope.go:117] "RemoveContainer" containerID="e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17"
Mar 10 14:57:38 crc kubenswrapper[4911]: E0310 14:57:38.084468 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17\": container with ID starting with e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17 not found: ID does not exist" containerID="e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17"
Mar 10 14:57:38 crc kubenswrapper[4911]: I0310 14:57:38.084500 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17"} err="failed to get container status \"e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17\": rpc error: code = NotFound desc = could not find container \"e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17\": container with ID starting with e87423e8826a8492fd8396cc690e2d0c067de5930f26c32a9d4839d928228c17 not found: ID does not exist"
Mar 10 14:57:38 crc kubenswrapper[4911]: I0310 14:57:38.205441 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" path="/var/lib/kubelet/pods/41a860a8-7190-4879-b159-7aaf49a4a9d8/volumes"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.188151 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552578-j8hft"]
Mar 10 14:58:00 crc kubenswrapper[4911]: E0310 14:58:00.189325 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerName="extract-content"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.189351 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerName="extract-content"
Mar 10 14:58:00 crc kubenswrapper[4911]: E0310 14:58:00.189374 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerName="registry-server"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.189382 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerName="registry-server"
Mar 10 14:58:00 crc kubenswrapper[4911]: E0310 14:58:00.189407 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerName="extract-utilities"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.189414 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerName="extract-utilities"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.189627 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a860a8-7190-4879-b159-7aaf49a4a9d8" containerName="registry-server"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.190368 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552578-j8hft"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.200569 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.200837 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.201116 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.206750 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552578-j8hft"]
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.359779 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78d2w\" (UniqueName: \"kubernetes.io/projected/31d142a4-12ea-429f-9755-388522ba6861-kube-api-access-78d2w\") pod \"auto-csr-approver-29552578-j8hft\" (UID: \"31d142a4-12ea-429f-9755-388522ba6861\") " pod="openshift-infra/auto-csr-approver-29552578-j8hft"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.461941 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78d2w\" (UniqueName: \"kubernetes.io/projected/31d142a4-12ea-429f-9755-388522ba6861-kube-api-access-78d2w\") pod \"auto-csr-approver-29552578-j8hft\" (UID: \"31d142a4-12ea-429f-9755-388522ba6861\") " pod="openshift-infra/auto-csr-approver-29552578-j8hft"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.481560 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78d2w\" (UniqueName: \"kubernetes.io/projected/31d142a4-12ea-429f-9755-388522ba6861-kube-api-access-78d2w\") pod \"auto-csr-approver-29552578-j8hft\" (UID: \"31d142a4-12ea-429f-9755-388522ba6861\") " pod="openshift-infra/auto-csr-approver-29552578-j8hft"
Mar 10 14:58:00 crc kubenswrapper[4911]: I0310 14:58:00.518218 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552578-j8hft"
Mar 10 14:58:01 crc kubenswrapper[4911]: I0310 14:58:01.025618 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552578-j8hft"]
Mar 10 14:58:01 crc kubenswrapper[4911]: I0310 14:58:01.214545 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552578-j8hft" event={"ID":"31d142a4-12ea-429f-9755-388522ba6861","Type":"ContainerStarted","Data":"8042fcec0f19029381daa795ad3326b0ba4442808ac75ddf15d446b36d35a4d9"}
Mar 10 14:58:03 crc kubenswrapper[4911]: I0310 14:58:03.235616 4911 generic.go:334] "Generic (PLEG): container finished" podID="31d142a4-12ea-429f-9755-388522ba6861" containerID="a2bbf2cc93217f0adeedde369d3dcb68c1b646d5a6d3b9e763f261262edc67ea" exitCode=0
Mar 10 14:58:03 crc kubenswrapper[4911]: I0310 14:58:03.236244 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552578-j8hft" event={"ID":"31d142a4-12ea-429f-9755-388522ba6861","Type":"ContainerDied","Data":"a2bbf2cc93217f0adeedde369d3dcb68c1b646d5a6d3b9e763f261262edc67ea"}
Mar 10 14:58:04 crc kubenswrapper[4911]: I0310 14:58:04.720627 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552578-j8hft"
Mar 10 14:58:04 crc kubenswrapper[4911]: I0310 14:58:04.868674 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78d2w\" (UniqueName: \"kubernetes.io/projected/31d142a4-12ea-429f-9755-388522ba6861-kube-api-access-78d2w\") pod \"31d142a4-12ea-429f-9755-388522ba6861\" (UID: \"31d142a4-12ea-429f-9755-388522ba6861\") "
Mar 10 14:58:04 crc kubenswrapper[4911]: I0310 14:58:04.876333 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d142a4-12ea-429f-9755-388522ba6861-kube-api-access-78d2w" (OuterVolumeSpecName: "kube-api-access-78d2w") pod "31d142a4-12ea-429f-9755-388522ba6861" (UID: "31d142a4-12ea-429f-9755-388522ba6861"). InnerVolumeSpecName "kube-api-access-78d2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 14:58:04 crc kubenswrapper[4911]: I0310 14:58:04.972627 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78d2w\" (UniqueName: \"kubernetes.io/projected/31d142a4-12ea-429f-9755-388522ba6861-kube-api-access-78d2w\") on node \"crc\" DevicePath \"\""
Mar 10 14:58:05 crc kubenswrapper[4911]: I0310 14:58:05.259495 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552578-j8hft" event={"ID":"31d142a4-12ea-429f-9755-388522ba6861","Type":"ContainerDied","Data":"8042fcec0f19029381daa795ad3326b0ba4442808ac75ddf15d446b36d35a4d9"}
Mar 10 14:58:05 crc kubenswrapper[4911]: I0310 14:58:05.259561 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8042fcec0f19029381daa795ad3326b0ba4442808ac75ddf15d446b36d35a4d9"
Mar 10 14:58:05 crc kubenswrapper[4911]: I0310 14:58:05.259537 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552578-j8hft"
Mar 10 14:58:05 crc kubenswrapper[4911]: I0310 14:58:05.799850 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552572-r952k"]
Mar 10 14:58:05 crc kubenswrapper[4911]: I0310 14:58:05.810232 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552572-r952k"]
Mar 10 14:58:06 crc kubenswrapper[4911]: I0310 14:58:06.227955 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe" path="/var/lib/kubelet/pods/5e92d98f-dbe7-4eab-be88-a12f6f2d2bbe/volumes"
Mar 10 14:58:48 crc kubenswrapper[4911]: I0310 14:58:48.531169 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:58:48 crc kubenswrapper[4911]: I0310 14:58:48.531652 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:58:57 crc kubenswrapper[4911]: I0310 14:58:57.382678 4911 scope.go:117] "RemoveContainer" containerID="dd7d2c6531c590bb40a27ef0f1c5ff5ee2086c3a826a5605686ae5f26c74825e"
Mar 10 14:59:18 crc kubenswrapper[4911]: I0310 14:59:18.521209 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 14:59:18 crc kubenswrapper[4911]: I0310 14:59:18.521895 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.497462 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cm6qq"]
Mar 10 14:59:35 crc kubenswrapper[4911]: E0310 14:59:35.498805 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d142a4-12ea-429f-9755-388522ba6861" containerName="oc"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.498828 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d142a4-12ea-429f-9755-388522ba6861" containerName="oc"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.499129 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d142a4-12ea-429f-9755-388522ba6861" containerName="oc"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.506258 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.515652 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cm6qq"]
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.535697 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-catalog-content\") pod \"redhat-marketplace-cm6qq\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.535783 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75hgj\" (UniqueName: \"kubernetes.io/projected/284fe168-872f-4db0-860c-093ef6c2fc3f-kube-api-access-75hgj\") pod \"redhat-marketplace-cm6qq\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.535910 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-utilities\") pod \"redhat-marketplace-cm6qq\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.637846 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-utilities\") pod \"redhat-marketplace-cm6qq\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.637994 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-catalog-content\") pod \"redhat-marketplace-cm6qq\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.638049 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75hgj\" (UniqueName: \"kubernetes.io/projected/284fe168-872f-4db0-860c-093ef6c2fc3f-kube-api-access-75hgj\") pod \"redhat-marketplace-cm6qq\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.638475 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-utilities\") pod \"redhat-marketplace-cm6qq\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.638551 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-catalog-content\") pod \"redhat-marketplace-cm6qq\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.662517 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75hgj\" (UniqueName: \"kubernetes.io/projected/284fe168-872f-4db0-860c-093ef6c2fc3f-kube-api-access-75hgj\") pod \"redhat-marketplace-cm6qq\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:35 crc kubenswrapper[4911]: I0310 14:59:35.835393 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:36 crc kubenswrapper[4911]: I0310 14:59:36.349814 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cm6qq"]
Mar 10 14:59:37 crc kubenswrapper[4911]: I0310 14:59:37.245317 4911 generic.go:334] "Generic (PLEG): container finished" podID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerID="c520999819d4be94e81fcfa1fbb67ba27aa8c1c83fe5cc1d61949510953c93a9" exitCode=0
Mar 10 14:59:37 crc kubenswrapper[4911]: I0310 14:59:37.245795 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cm6qq" event={"ID":"284fe168-872f-4db0-860c-093ef6c2fc3f","Type":"ContainerDied","Data":"c520999819d4be94e81fcfa1fbb67ba27aa8c1c83fe5cc1d61949510953c93a9"}
Mar 10 14:59:37 crc kubenswrapper[4911]: I0310 14:59:37.245836 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cm6qq" event={"ID":"284fe168-872f-4db0-860c-093ef6c2fc3f","Type":"ContainerStarted","Data":"e75aa6d9e4e9e3f70e10203ed24a9fe9c638e336e3e781b28a4f4e38e37c91ea"}
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.496029 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d92jx"]
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.499995 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.515233 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d92jx"]
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.606973 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-catalog-content\") pod \"community-operators-d92jx\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.607081 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mhwp\" (UniqueName: \"kubernetes.io/projected/8a544872-0659-48aa-90ba-924862223c60-kube-api-access-4mhwp\") pod \"community-operators-d92jx\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.607141 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-utilities\") pod \"community-operators-d92jx\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.708785 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-catalog-content\") pod \"community-operators-d92jx\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.708863 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mhwp\" (UniqueName: \"kubernetes.io/projected/8a544872-0659-48aa-90ba-924862223c60-kube-api-access-4mhwp\") pod \"community-operators-d92jx\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.708913 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-utilities\") pod \"community-operators-d92jx\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.709510 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-utilities\") pod \"community-operators-d92jx\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.709799 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-catalog-content\") pod \"community-operators-d92jx\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.729262 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mhwp\" (UniqueName: \"kubernetes.io/projected/8a544872-0659-48aa-90ba-924862223c60-kube-api-access-4mhwp\") pod \"community-operators-d92jx\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:38 crc kubenswrapper[4911]: I0310 14:59:38.832022 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d92jx"
Mar 10 14:59:39 crc kubenswrapper[4911]: I0310 14:59:39.271657 4911 generic.go:334] "Generic (PLEG): container finished" podID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerID="13e40283ed04b45bb1c5d4da493bc14364e31011eb019c94ac03dbe7e91eedac" exitCode=0
Mar 10 14:59:39 crc kubenswrapper[4911]: I0310 14:59:39.271793 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cm6qq" event={"ID":"284fe168-872f-4db0-860c-093ef6c2fc3f","Type":"ContainerDied","Data":"13e40283ed04b45bb1c5d4da493bc14364e31011eb019c94ac03dbe7e91eedac"}
Mar 10 14:59:39 crc kubenswrapper[4911]: W0310 14:59:39.413583 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a544872_0659_48aa_90ba_924862223c60.slice/crio-276c4de196c83b40729ee406fda72d5a704705dfe00c8981d7c55714a90ccb8c WatchSource:0}: Error finding container 276c4de196c83b40729ee406fda72d5a704705dfe00c8981d7c55714a90ccb8c: Status 404 returned error can't find the container with id 276c4de196c83b40729ee406fda72d5a704705dfe00c8981d7c55714a90ccb8c
Mar 10 14:59:39 crc kubenswrapper[4911]: I0310 14:59:39.421080 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d92jx"]
Mar 10 14:59:40 crc kubenswrapper[4911]: I0310 14:59:40.287083 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cm6qq" event={"ID":"284fe168-872f-4db0-860c-093ef6c2fc3f","Type":"ContainerStarted","Data":"a28c1c844a03efbbf4847cb5698e2eab234172bd2b1fcccb3daa7aaebfbb6e3c"}
Mar 10 14:59:40 crc kubenswrapper[4911]: I0310 14:59:40.291496 4911 generic.go:334] "Generic (PLEG): container finished" podID="8a544872-0659-48aa-90ba-924862223c60" containerID="f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e" exitCode=0
Mar 10 14:59:40 crc kubenswrapper[4911]: I0310 14:59:40.291574 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d92jx" event={"ID":"8a544872-0659-48aa-90ba-924862223c60","Type":"ContainerDied","Data":"f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e"}
Mar 10 14:59:40 crc kubenswrapper[4911]: I0310 14:59:40.291627 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d92jx" event={"ID":"8a544872-0659-48aa-90ba-924862223c60","Type":"ContainerStarted","Data":"276c4de196c83b40729ee406fda72d5a704705dfe00c8981d7c55714a90ccb8c"}
Mar 10 14:59:40 crc kubenswrapper[4911]: I0310 14:59:40.315937 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cm6qq" podStartSLOduration=2.864718501 podStartE2EDuration="5.315905666s" podCreationTimestamp="2026-03-10 14:59:35 +0000 UTC" firstStartedPulling="2026-03-10 14:59:37.248558245 +0000 UTC m=+3481.812078162" lastFinishedPulling="2026-03-10 14:59:39.69974541 +0000 UTC m=+3484.263265327" observedRunningTime="2026-03-10 14:59:40.311054675 +0000 UTC m=+3484.874574592" watchObservedRunningTime="2026-03-10 14:59:40.315905666 +0000 UTC m=+3484.879425583"
Mar 10 14:59:41 crc kubenswrapper[4911]: I0310 14:59:41.322713 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d92jx" event={"ID":"8a544872-0659-48aa-90ba-924862223c60","Type":"ContainerStarted","Data":"ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3"}
Mar 10 14:59:42 crc kubenswrapper[4911]: I0310 14:59:42.335563 4911 generic.go:334] "Generic (PLEG): container finished" podID="8a544872-0659-48aa-90ba-924862223c60" containerID="ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3" exitCode=0
Mar 10 14:59:42 crc kubenswrapper[4911]: I0310 14:59:42.335692 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d92jx" event={"ID":"8a544872-0659-48aa-90ba-924862223c60","Type":"ContainerDied","Data":"ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3"}
Mar 10 14:59:44 crc kubenswrapper[4911]: I0310 14:59:44.359803 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d92jx" event={"ID":"8a544872-0659-48aa-90ba-924862223c60","Type":"ContainerStarted","Data":"5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43"}
Mar 10 14:59:44 crc kubenswrapper[4911]: I0310 14:59:44.393232 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d92jx" podStartSLOduration=3.557244162 podStartE2EDuration="6.393191749s" podCreationTimestamp="2026-03-10 14:59:38 +0000 UTC" firstStartedPulling="2026-03-10 14:59:40.294616423 +0000 UTC m=+3484.858136340" lastFinishedPulling="2026-03-10 14:59:43.130564 +0000 UTC m=+3487.694083927" observedRunningTime="2026-03-10 14:59:44.380318833 +0000 UTC m=+3488.943838750" watchObservedRunningTime="2026-03-10 14:59:44.393191749 +0000 UTC m=+3488.956711686"
Mar 10 14:59:45 crc kubenswrapper[4911]: I0310 14:59:45.836445 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:45 crc kubenswrapper[4911]: I0310 14:59:45.837137 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:45 crc kubenswrapper[4911]: I0310 14:59:45.882822 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:46 crc kubenswrapper[4911]: I0310 14:59:46.426158 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cm6qq"
Mar 10 14:59:48 crc kubenswrapper[4911]: I0310 14:59:48.520812 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 14:59:48 crc kubenswrapper[4911]: I0310 14:59:48.521200 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 14:59:48 crc kubenswrapper[4911]: I0310 14:59:48.521270 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 14:59:48 crc kubenswrapper[4911]: I0310 14:59:48.522374 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19ad5193c0bc4ed506cb6042cc3e642a139d1b08dd49628d1a8a9e56568fa86f"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 14:59:48 crc kubenswrapper[4911]: I0310 14:59:48.522443 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://19ad5193c0bc4ed506cb6042cc3e642a139d1b08dd49628d1a8a9e56568fa86f" gracePeriod=600 Mar 10 14:59:48 crc kubenswrapper[4911]: I0310 14:59:48.832284 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d92jx" Mar 10 14:59:48 crc kubenswrapper[4911]: I0310 14:59:48.832802 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-d92jx" Mar 10 14:59:48 crc kubenswrapper[4911]: I0310 14:59:48.898539 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d92jx" Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.086553 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cm6qq"] Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.086880 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cm6qq" podUID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerName="registry-server" containerID="cri-o://a28c1c844a03efbbf4847cb5698e2eab234172bd2b1fcccb3daa7aaebfbb6e3c" gracePeriod=2 Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.421252 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="19ad5193c0bc4ed506cb6042cc3e642a139d1b08dd49628d1a8a9e56568fa86f" exitCode=0 Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.421321 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"19ad5193c0bc4ed506cb6042cc3e642a139d1b08dd49628d1a8a9e56568fa86f"} Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.421698 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848"} Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.421724 4911 scope.go:117] "RemoveContainer" containerID="b53ddb2c7e2d03b02e242ccc67ef0f78be60f81c633750764330dcb64252ba7b" Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.429180 4911 generic.go:334] "Generic (PLEG): 
container finished" podID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerID="a28c1c844a03efbbf4847cb5698e2eab234172bd2b1fcccb3daa7aaebfbb6e3c" exitCode=0 Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.429242 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cm6qq" event={"ID":"284fe168-872f-4db0-860c-093ef6c2fc3f","Type":"ContainerDied","Data":"a28c1c844a03efbbf4847cb5698e2eab234172bd2b1fcccb3daa7aaebfbb6e3c"} Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.490447 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d92jx" Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.656469 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cm6qq" Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.754503 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75hgj\" (UniqueName: \"kubernetes.io/projected/284fe168-872f-4db0-860c-093ef6c2fc3f-kube-api-access-75hgj\") pod \"284fe168-872f-4db0-860c-093ef6c2fc3f\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.754627 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-catalog-content\") pod \"284fe168-872f-4db0-860c-093ef6c2fc3f\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.754713 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-utilities\") pod \"284fe168-872f-4db0-860c-093ef6c2fc3f\" (UID: \"284fe168-872f-4db0-860c-093ef6c2fc3f\") " Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.756079 4911 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-utilities" (OuterVolumeSpecName: "utilities") pod "284fe168-872f-4db0-860c-093ef6c2fc3f" (UID: "284fe168-872f-4db0-860c-093ef6c2fc3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.762011 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284fe168-872f-4db0-860c-093ef6c2fc3f-kube-api-access-75hgj" (OuterVolumeSpecName: "kube-api-access-75hgj") pod "284fe168-872f-4db0-860c-093ef6c2fc3f" (UID: "284fe168-872f-4db0-860c-093ef6c2fc3f"). InnerVolumeSpecName "kube-api-access-75hgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.784296 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "284fe168-872f-4db0-860c-093ef6c2fc3f" (UID: "284fe168-872f-4db0-860c-093ef6c2fc3f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.857667 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75hgj\" (UniqueName: \"kubernetes.io/projected/284fe168-872f-4db0-860c-093ef6c2fc3f-kube-api-access-75hgj\") on node \"crc\" DevicePath \"\"" Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.857716 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:59:49 crc kubenswrapper[4911]: I0310 14:59:49.857750 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/284fe168-872f-4db0-860c-093ef6c2fc3f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:59:50 crc kubenswrapper[4911]: I0310 14:59:50.449896 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cm6qq" Mar 10 14:59:50 crc kubenswrapper[4911]: I0310 14:59:50.449906 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cm6qq" event={"ID":"284fe168-872f-4db0-860c-093ef6c2fc3f","Type":"ContainerDied","Data":"e75aa6d9e4e9e3f70e10203ed24a9fe9c638e336e3e781b28a4f4e38e37c91ea"} Mar 10 14:59:50 crc kubenswrapper[4911]: I0310 14:59:50.450026 4911 scope.go:117] "RemoveContainer" containerID="a28c1c844a03efbbf4847cb5698e2eab234172bd2b1fcccb3daa7aaebfbb6e3c" Mar 10 14:59:50 crc kubenswrapper[4911]: I0310 14:59:50.490346 4911 scope.go:117] "RemoveContainer" containerID="13e40283ed04b45bb1c5d4da493bc14364e31011eb019c94ac03dbe7e91eedac" Mar 10 14:59:50 crc kubenswrapper[4911]: I0310 14:59:50.493066 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cm6qq"] Mar 10 14:59:50 crc kubenswrapper[4911]: I0310 14:59:50.506333 4911 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cm6qq"] Mar 10 14:59:50 crc kubenswrapper[4911]: I0310 14:59:50.514816 4911 scope.go:117] "RemoveContainer" containerID="c520999819d4be94e81fcfa1fbb67ba27aa8c1c83fe5cc1d61949510953c93a9" Mar 10 14:59:51 crc kubenswrapper[4911]: I0310 14:59:51.282440 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d92jx"] Mar 10 14:59:51 crc kubenswrapper[4911]: I0310 14:59:51.469693 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d92jx" podUID="8a544872-0659-48aa-90ba-924862223c60" containerName="registry-server" containerID="cri-o://5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43" gracePeriod=2 Mar 10 14:59:51 crc kubenswrapper[4911]: I0310 14:59:51.985791 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d92jx" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.111865 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-utilities\") pod \"8a544872-0659-48aa-90ba-924862223c60\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.112204 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mhwp\" (UniqueName: \"kubernetes.io/projected/8a544872-0659-48aa-90ba-924862223c60-kube-api-access-4mhwp\") pod \"8a544872-0659-48aa-90ba-924862223c60\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.112434 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-catalog-content\") pod 
\"8a544872-0659-48aa-90ba-924862223c60\" (UID: \"8a544872-0659-48aa-90ba-924862223c60\") " Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.114097 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-utilities" (OuterVolumeSpecName: "utilities") pod "8a544872-0659-48aa-90ba-924862223c60" (UID: "8a544872-0659-48aa-90ba-924862223c60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.121223 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a544872-0659-48aa-90ba-924862223c60-kube-api-access-4mhwp" (OuterVolumeSpecName: "kube-api-access-4mhwp") pod "8a544872-0659-48aa-90ba-924862223c60" (UID: "8a544872-0659-48aa-90ba-924862223c60"). InnerVolumeSpecName "kube-api-access-4mhwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.167531 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a544872-0659-48aa-90ba-924862223c60" (UID: "8a544872-0659-48aa-90ba-924862223c60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.205988 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="284fe168-872f-4db0-860c-093ef6c2fc3f" path="/var/lib/kubelet/pods/284fe168-872f-4db0-860c-093ef6c2fc3f/volumes" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.214541 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.214579 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a544872-0659-48aa-90ba-924862223c60-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.214592 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mhwp\" (UniqueName: \"kubernetes.io/projected/8a544872-0659-48aa-90ba-924862223c60-kube-api-access-4mhwp\") on node \"crc\" DevicePath \"\"" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.483346 4911 generic.go:334] "Generic (PLEG): container finished" podID="8a544872-0659-48aa-90ba-924862223c60" containerID="5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43" exitCode=0 Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.483404 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d92jx" event={"ID":"8a544872-0659-48aa-90ba-924862223c60","Type":"ContainerDied","Data":"5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43"} Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.483494 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d92jx" 
event={"ID":"8a544872-0659-48aa-90ba-924862223c60","Type":"ContainerDied","Data":"276c4de196c83b40729ee406fda72d5a704705dfe00c8981d7c55714a90ccb8c"} Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.483491 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d92jx" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.483637 4911 scope.go:117] "RemoveContainer" containerID="5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.512850 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d92jx"] Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.516921 4911 scope.go:117] "RemoveContainer" containerID="ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.522978 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d92jx"] Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.542634 4911 scope.go:117] "RemoveContainer" containerID="f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.603792 4911 scope.go:117] "RemoveContainer" containerID="5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43" Mar 10 14:59:52 crc kubenswrapper[4911]: E0310 14:59:52.604505 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43\": container with ID starting with 5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43 not found: ID does not exist" containerID="5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.604595 4911 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43"} err="failed to get container status \"5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43\": rpc error: code = NotFound desc = could not find container \"5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43\": container with ID starting with 5aeb343e1981e091f6d74b4be45dc70df0ebff5a563d33fe2d10b492f4decc43 not found: ID does not exist" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.604780 4911 scope.go:117] "RemoveContainer" containerID="ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3" Mar 10 14:59:52 crc kubenswrapper[4911]: E0310 14:59:52.606772 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3\": container with ID starting with ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3 not found: ID does not exist" containerID="ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.606825 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3"} err="failed to get container status \"ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3\": rpc error: code = NotFound desc = could not find container \"ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3\": container with ID starting with ae758032ba79c2b2ac7db3dab9949387293e087566a5397be91f2f3325c07be3 not found: ID does not exist" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.606858 4911 scope.go:117] "RemoveContainer" containerID="f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e" Mar 10 14:59:52 crc kubenswrapper[4911]: E0310 14:59:52.607325 4911 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e\": container with ID starting with f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e not found: ID does not exist" containerID="f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e" Mar 10 14:59:52 crc kubenswrapper[4911]: I0310 14:59:52.607347 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e"} err="failed to get container status \"f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e\": rpc error: code = NotFound desc = could not find container \"f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e\": container with ID starting with f18210cf8b1daf9d53bc0ae409b3aff0c33fa03f780e10852352a2686b0d705e not found: ID does not exist" Mar 10 14:59:54 crc kubenswrapper[4911]: I0310 14:59:54.206624 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a544872-0659-48aa-90ba-924862223c60" path="/var/lib/kubelet/pods/8a544872-0659-48aa-90ba-924862223c60/volumes" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.174713 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552580-h4647"] Mar 10 15:00:00 crc kubenswrapper[4911]: E0310 15:00:00.176057 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerName="registry-server" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.176078 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerName="registry-server" Mar 10 15:00:00 crc kubenswrapper[4911]: E0310 15:00:00.176091 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerName="extract-content" Mar 10 15:00:00 crc 
kubenswrapper[4911]: I0310 15:00:00.176100 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerName="extract-content" Mar 10 15:00:00 crc kubenswrapper[4911]: E0310 15:00:00.176133 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a544872-0659-48aa-90ba-924862223c60" containerName="extract-utilities" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.176142 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a544872-0659-48aa-90ba-924862223c60" containerName="extract-utilities" Mar 10 15:00:00 crc kubenswrapper[4911]: E0310 15:00:00.176154 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a544872-0659-48aa-90ba-924862223c60" containerName="extract-content" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.176163 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a544872-0659-48aa-90ba-924862223c60" containerName="extract-content" Mar 10 15:00:00 crc kubenswrapper[4911]: E0310 15:00:00.176187 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerName="extract-utilities" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.176195 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerName="extract-utilities" Mar 10 15:00:00 crc kubenswrapper[4911]: E0310 15:00:00.176221 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a544872-0659-48aa-90ba-924862223c60" containerName="registry-server" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.176229 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a544872-0659-48aa-90ba-924862223c60" containerName="registry-server" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.176457 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="284fe168-872f-4db0-860c-093ef6c2fc3f" containerName="registry-server" Mar 10 15:00:00 crc 
kubenswrapper[4911]: I0310 15:00:00.176485 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a544872-0659-48aa-90ba-924862223c60" containerName="registry-server" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.177346 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552580-h4647" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.181127 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.181293 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.182010 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.215428 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552580-h4647"] Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.266586 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"] Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.268396 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.271553 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.271986 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.277330 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"] Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.309889 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p92q\" (UniqueName: \"kubernetes.io/projected/766a5fea-e025-4ad7-94aa-037143ddf8a3-kube-api-access-6p92q\") pod \"auto-csr-approver-29552580-h4647\" (UID: \"766a5fea-e025-4ad7-94aa-037143ddf8a3\") " pod="openshift-infra/auto-csr-approver-29552580-h4647" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.412744 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d9f1edb-0da4-4697-b672-09f1738dd70f-config-volume\") pod \"collect-profiles-29552580-f9wqs\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs" Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.412843 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8d9f1edb-0da4-4697-b672-09f1738dd70f-secret-volume\") pod \"collect-profiles-29552580-f9wqs\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.412879 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pff7v\" (UniqueName: \"kubernetes.io/projected/8d9f1edb-0da4-4697-b672-09f1738dd70f-kube-api-access-pff7v\") pod \"collect-profiles-29552580-f9wqs\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.412924 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p92q\" (UniqueName: \"kubernetes.io/projected/766a5fea-e025-4ad7-94aa-037143ddf8a3-kube-api-access-6p92q\") pod \"auto-csr-approver-29552580-h4647\" (UID: \"766a5fea-e025-4ad7-94aa-037143ddf8a3\") " pod="openshift-infra/auto-csr-approver-29552580-h4647"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.437489 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p92q\" (UniqueName: \"kubernetes.io/projected/766a5fea-e025-4ad7-94aa-037143ddf8a3-kube-api-access-6p92q\") pod \"auto-csr-approver-29552580-h4647\" (UID: \"766a5fea-e025-4ad7-94aa-037143ddf8a3\") " pod="openshift-infra/auto-csr-approver-29552580-h4647"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.503132 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552580-h4647"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.515134 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d9f1edb-0da4-4697-b672-09f1738dd70f-config-volume\") pod \"collect-profiles-29552580-f9wqs\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.515254 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8d9f1edb-0da4-4697-b672-09f1738dd70f-secret-volume\") pod \"collect-profiles-29552580-f9wqs\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.515291 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pff7v\" (UniqueName: \"kubernetes.io/projected/8d9f1edb-0da4-4697-b672-09f1738dd70f-kube-api-access-pff7v\") pod \"collect-profiles-29552580-f9wqs\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.516246 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d9f1edb-0da4-4697-b672-09f1738dd70f-config-volume\") pod \"collect-profiles-29552580-f9wqs\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.520675 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8d9f1edb-0da4-4697-b672-09f1738dd70f-secret-volume\") pod \"collect-profiles-29552580-f9wqs\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.536839 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pff7v\" (UniqueName: \"kubernetes.io/projected/8d9f1edb-0da4-4697-b672-09f1738dd70f-kube-api-access-pff7v\") pod \"collect-profiles-29552580-f9wqs\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:00 crc kubenswrapper[4911]: I0310 15:00:00.626985 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:01 crc kubenswrapper[4911]: I0310 15:00:01.012808 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552580-h4647"]
Mar 10 15:00:01 crc kubenswrapper[4911]: I0310 15:00:01.171103 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"]
Mar 10 15:00:01 crc kubenswrapper[4911]: W0310 15:00:01.181284 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d9f1edb_0da4_4697_b672_09f1738dd70f.slice/crio-bc9e9b44e34ccf3d2d3c85b3e34775fe88e65aeb5bbce4394c072c626ead6ca9 WatchSource:0}: Error finding container bc9e9b44e34ccf3d2d3c85b3e34775fe88e65aeb5bbce4394c072c626ead6ca9: Status 404 returned error can't find the container with id bc9e9b44e34ccf3d2d3c85b3e34775fe88e65aeb5bbce4394c072c626ead6ca9
Mar 10 15:00:01 crc kubenswrapper[4911]: I0310 15:00:01.574510 4911 generic.go:334] "Generic (PLEG): container finished" podID="8d9f1edb-0da4-4697-b672-09f1738dd70f" containerID="a8b28b7c164fe276472bb5009ca8c4e2b552c8d931c84e5fdcc681910ef57cf3" exitCode=0
Mar 10 15:00:01 crc kubenswrapper[4911]: I0310 15:00:01.574629 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs" event={"ID":"8d9f1edb-0da4-4697-b672-09f1738dd70f","Type":"ContainerDied","Data":"a8b28b7c164fe276472bb5009ca8c4e2b552c8d931c84e5fdcc681910ef57cf3"}
Mar 10 15:00:01 crc kubenswrapper[4911]: I0310 15:00:01.574675 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs" event={"ID":"8d9f1edb-0da4-4697-b672-09f1738dd70f","Type":"ContainerStarted","Data":"bc9e9b44e34ccf3d2d3c85b3e34775fe88e65aeb5bbce4394c072c626ead6ca9"}
Mar 10 15:00:01 crc kubenswrapper[4911]: I0310 15:00:01.577443 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552580-h4647" event={"ID":"766a5fea-e025-4ad7-94aa-037143ddf8a3","Type":"ContainerStarted","Data":"a83754a2c013a261fa5834546ae6fdfefefaea22a8a637fef580aebff18f5c82"}
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.017057 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.074565 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8d9f1edb-0da4-4697-b672-09f1738dd70f-secret-volume\") pod \"8d9f1edb-0da4-4697-b672-09f1738dd70f\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") "
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.074698 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d9f1edb-0da4-4697-b672-09f1738dd70f-config-volume\") pod \"8d9f1edb-0da4-4697-b672-09f1738dd70f\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") "
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.074863 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pff7v\" (UniqueName: \"kubernetes.io/projected/8d9f1edb-0da4-4697-b672-09f1738dd70f-kube-api-access-pff7v\") pod \"8d9f1edb-0da4-4697-b672-09f1738dd70f\" (UID: \"8d9f1edb-0da4-4697-b672-09f1738dd70f\") "
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.075966 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d9f1edb-0da4-4697-b672-09f1738dd70f-config-volume" (OuterVolumeSpecName: "config-volume") pod "8d9f1edb-0da4-4697-b672-09f1738dd70f" (UID: "8d9f1edb-0da4-4697-b672-09f1738dd70f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.082974 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9f1edb-0da4-4697-b672-09f1738dd70f-kube-api-access-pff7v" (OuterVolumeSpecName: "kube-api-access-pff7v") pod "8d9f1edb-0da4-4697-b672-09f1738dd70f" (UID: "8d9f1edb-0da4-4697-b672-09f1738dd70f"). InnerVolumeSpecName "kube-api-access-pff7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.084937 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9f1edb-0da4-4697-b672-09f1738dd70f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8d9f1edb-0da4-4697-b672-09f1738dd70f" (UID: "8d9f1edb-0da4-4697-b672-09f1738dd70f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.179609 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d9f1edb-0da4-4697-b672-09f1738dd70f-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.179682 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pff7v\" (UniqueName: \"kubernetes.io/projected/8d9f1edb-0da4-4697-b672-09f1738dd70f-kube-api-access-pff7v\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.179703 4911 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8d9f1edb-0da4-4697-b672-09f1738dd70f-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.602827 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs" event={"ID":"8d9f1edb-0da4-4697-b672-09f1738dd70f","Type":"ContainerDied","Data":"bc9e9b44e34ccf3d2d3c85b3e34775fe88e65aeb5bbce4394c072c626ead6ca9"}
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.602881 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc9e9b44e34ccf3d2d3c85b3e34775fe88e65aeb5bbce4394c072c626ead6ca9"
Mar 10 15:00:03 crc kubenswrapper[4911]: I0310 15:00:03.602898 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-f9wqs"
Mar 10 15:00:04 crc kubenswrapper[4911]: I0310 15:00:04.130173 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"]
Mar 10 15:00:04 crc kubenswrapper[4911]: I0310 15:00:04.142892 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552535-69fkb"]
Mar 10 15:00:04 crc kubenswrapper[4911]: I0310 15:00:04.212548 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29a4b10-f12c-47df-9061-c208883696aa" path="/var/lib/kubelet/pods/f29a4b10-f12c-47df-9061-c208883696aa/volumes"
Mar 10 15:00:05 crc kubenswrapper[4911]: I0310 15:00:05.634300 4911 generic.go:334] "Generic (PLEG): container finished" podID="766a5fea-e025-4ad7-94aa-037143ddf8a3" containerID="ce855d5066d81124ceb3d1e96738cc58ceec7cc7a3f348d260388fd082163256" exitCode=0
Mar 10 15:00:05 crc kubenswrapper[4911]: I0310 15:00:05.634387 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552580-h4647" event={"ID":"766a5fea-e025-4ad7-94aa-037143ddf8a3","Type":"ContainerDied","Data":"ce855d5066d81124ceb3d1e96738cc58ceec7cc7a3f348d260388fd082163256"}
Mar 10 15:00:07 crc kubenswrapper[4911]: I0310 15:00:07.025694 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552580-h4647"
Mar 10 15:00:07 crc kubenswrapper[4911]: I0310 15:00:07.067300 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p92q\" (UniqueName: \"kubernetes.io/projected/766a5fea-e025-4ad7-94aa-037143ddf8a3-kube-api-access-6p92q\") pod \"766a5fea-e025-4ad7-94aa-037143ddf8a3\" (UID: \"766a5fea-e025-4ad7-94aa-037143ddf8a3\") "
Mar 10 15:00:07 crc kubenswrapper[4911]: I0310 15:00:07.078075 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/766a5fea-e025-4ad7-94aa-037143ddf8a3-kube-api-access-6p92q" (OuterVolumeSpecName: "kube-api-access-6p92q") pod "766a5fea-e025-4ad7-94aa-037143ddf8a3" (UID: "766a5fea-e025-4ad7-94aa-037143ddf8a3"). InnerVolumeSpecName "kube-api-access-6p92q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:00:07 crc kubenswrapper[4911]: I0310 15:00:07.170564 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p92q\" (UniqueName: \"kubernetes.io/projected/766a5fea-e025-4ad7-94aa-037143ddf8a3-kube-api-access-6p92q\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:07 crc kubenswrapper[4911]: I0310 15:00:07.656037 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552580-h4647" event={"ID":"766a5fea-e025-4ad7-94aa-037143ddf8a3","Type":"ContainerDied","Data":"a83754a2c013a261fa5834546ae6fdfefefaea22a8a637fef580aebff18f5c82"}
Mar 10 15:00:07 crc kubenswrapper[4911]: I0310 15:00:07.656106 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83754a2c013a261fa5834546ae6fdfefefaea22a8a637fef580aebff18f5c82"
Mar 10 15:00:07 crc kubenswrapper[4911]: I0310 15:00:07.656135 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552580-h4647"
Mar 10 15:00:08 crc kubenswrapper[4911]: I0310 15:00:08.095698 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552574-j4t7l"]
Mar 10 15:00:08 crc kubenswrapper[4911]: I0310 15:00:08.104654 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552574-j4t7l"]
Mar 10 15:00:08 crc kubenswrapper[4911]: I0310 15:00:08.204966 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="998046ee-be1e-4a06-a0f5-91f9c808b53f" path="/var/lib/kubelet/pods/998046ee-be1e-4a06-a0f5-91f9c808b53f/volumes"
Mar 10 15:00:55 crc kubenswrapper[4911]: I0310 15:00:55.152265 4911 generic.go:334] "Generic (PLEG): container finished" podID="c6a78318-420c-43fe-98f3-9306e18ee2d4" containerID="dfa323fb63696c5eb6e33a9eb4b0973f727d796e2b864d596496753324281a3c" exitCode=0
Mar 10 15:00:55 crc kubenswrapper[4911]: I0310 15:00:55.152383 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c6a78318-420c-43fe-98f3-9306e18ee2d4","Type":"ContainerDied","Data":"dfa323fb63696c5eb6e33a9eb4b0973f727d796e2b864d596496753324281a3c"}
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.524501 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.626602 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config-secret\") pod \"c6a78318-420c-43fe-98f3-9306e18ee2d4\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") "
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.627038 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ca-certs\") pod \"c6a78318-420c-43fe-98f3-9306e18ee2d4\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") "
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.627139 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ssh-key\") pod \"c6a78318-420c-43fe-98f3-9306e18ee2d4\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") "
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.627293 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-workdir\") pod \"c6a78318-420c-43fe-98f3-9306e18ee2d4\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") "
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.627360 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c6a78318-420c-43fe-98f3-9306e18ee2d4\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") "
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.627396 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-temporary\") pod \"c6a78318-420c-43fe-98f3-9306e18ee2d4\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") "
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.627453 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-config-data\") pod \"c6a78318-420c-43fe-98f3-9306e18ee2d4\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") "
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.627528 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config\") pod \"c6a78318-420c-43fe-98f3-9306e18ee2d4\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") "
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.627574 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9c7v\" (UniqueName: \"kubernetes.io/projected/c6a78318-420c-43fe-98f3-9306e18ee2d4-kube-api-access-c9c7v\") pod \"c6a78318-420c-43fe-98f3-9306e18ee2d4\" (UID: \"c6a78318-420c-43fe-98f3-9306e18ee2d4\") "
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.628054 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "c6a78318-420c-43fe-98f3-9306e18ee2d4" (UID: "c6a78318-420c-43fe-98f3-9306e18ee2d4"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.628310 4911 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.628370 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-config-data" (OuterVolumeSpecName: "config-data") pod "c6a78318-420c-43fe-98f3-9306e18ee2d4" (UID: "c6a78318-420c-43fe-98f3-9306e18ee2d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.633844 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "c6a78318-420c-43fe-98f3-9306e18ee2d4" (UID: "c6a78318-420c-43fe-98f3-9306e18ee2d4"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.633951 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a78318-420c-43fe-98f3-9306e18ee2d4-kube-api-access-c9c7v" (OuterVolumeSpecName: "kube-api-access-c9c7v") pod "c6a78318-420c-43fe-98f3-9306e18ee2d4" (UID: "c6a78318-420c-43fe-98f3-9306e18ee2d4"). InnerVolumeSpecName "kube-api-access-c9c7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.634752 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "c6a78318-420c-43fe-98f3-9306e18ee2d4" (UID: "c6a78318-420c-43fe-98f3-9306e18ee2d4"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.659199 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c6a78318-420c-43fe-98f3-9306e18ee2d4" (UID: "c6a78318-420c-43fe-98f3-9306e18ee2d4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.661087 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c6a78318-420c-43fe-98f3-9306e18ee2d4" (UID: "c6a78318-420c-43fe-98f3-9306e18ee2d4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.662130 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "c6a78318-420c-43fe-98f3-9306e18ee2d4" (UID: "c6a78318-420c-43fe-98f3-9306e18ee2d4"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.679421 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c6a78318-420c-43fe-98f3-9306e18ee2d4" (UID: "c6a78318-420c-43fe-98f3-9306e18ee2d4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.730890 4911 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.730936 4911 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c6a78318-420c-43fe-98f3-9306e18ee2d4-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.731022 4911 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.731043 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.731056 4911 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.731071 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9c7v\" (UniqueName: \"kubernetes.io/projected/c6a78318-420c-43fe-98f3-9306e18ee2d4-kube-api-access-c9c7v\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.731088 4911 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.731102 4911 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c6a78318-420c-43fe-98f3-9306e18ee2d4-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.754598 4911 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 10 15:00:56 crc kubenswrapper[4911]: I0310 15:00:56.833619 4911 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 10 15:00:57 crc kubenswrapper[4911]: I0310 15:00:57.174969 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c6a78318-420c-43fe-98f3-9306e18ee2d4","Type":"ContainerDied","Data":"cec02596dce5cb5161a23e8d1e4e02523ba0de43c56ed8f10e5ccdc8baa804cb"}
Mar 10 15:00:57 crc kubenswrapper[4911]: I0310 15:00:57.175022 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cec02596dce5cb5161a23e8d1e4e02523ba0de43c56ed8f10e5ccdc8baa804cb"
Mar 10 15:00:57 crc kubenswrapper[4911]: I0310 15:00:57.175101 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 10 15:00:57 crc kubenswrapper[4911]: I0310 15:00:57.517394 4911 scope.go:117] "RemoveContainer" containerID="a8a3061e1682af79f2d778689fa9ff66eb1275dcb7caab2f3649ac3aef929731"
Mar 10 15:00:57 crc kubenswrapper[4911]: I0310 15:00:57.552991 4911 scope.go:117] "RemoveContainer" containerID="79c5f9eb6d80cff3d980466a6c90591a2a0f50f81bd35c2f259eb4d072d6c8a2"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.781532 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 10 15:00:59 crc kubenswrapper[4911]: E0310 15:00:59.782674 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="766a5fea-e025-4ad7-94aa-037143ddf8a3" containerName="oc"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.782692 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="766a5fea-e025-4ad7-94aa-037143ddf8a3" containerName="oc"
Mar 10 15:00:59 crc kubenswrapper[4911]: E0310 15:00:59.782717 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a78318-420c-43fe-98f3-9306e18ee2d4" containerName="tempest-tests-tempest-tests-runner"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.782779 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a78318-420c-43fe-98f3-9306e18ee2d4" containerName="tempest-tests-tempest-tests-runner"
Mar 10 15:00:59 crc kubenswrapper[4911]: E0310 15:00:59.782793 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9f1edb-0da4-4697-b672-09f1738dd70f" containerName="collect-profiles"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.782801 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9f1edb-0da4-4697-b672-09f1738dd70f" containerName="collect-profiles"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.782998 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a78318-420c-43fe-98f3-9306e18ee2d4" containerName="tempest-tests-tempest-tests-runner"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.783024 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="766a5fea-e025-4ad7-94aa-037143ddf8a3" containerName="oc"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.783038 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9f1edb-0da4-4697-b672-09f1738dd70f" containerName="collect-profiles"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.783952 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.786464 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-k2nwb"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.790663 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.899382 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9bmf\" (UniqueName: \"kubernetes.io/projected/736e35e8-db53-456b-a374-50f70159f967-kube-api-access-p9bmf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"736e35e8-db53-456b-a374-50f70159f967\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 10 15:00:59 crc kubenswrapper[4911]: I0310 15:00:59.899784 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"736e35e8-db53-456b-a374-50f70159f967\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.001801 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9bmf\" (UniqueName: \"kubernetes.io/projected/736e35e8-db53-456b-a374-50f70159f967-kube-api-access-p9bmf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"736e35e8-db53-456b-a374-50f70159f967\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.002342 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"736e35e8-db53-456b-a374-50f70159f967\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.003566 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"736e35e8-db53-456b-a374-50f70159f967\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.030138 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9bmf\" (UniqueName: \"kubernetes.io/projected/736e35e8-db53-456b-a374-50f70159f967-kube-api-access-p9bmf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"736e35e8-db53-456b-a374-50f70159f967\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.040487 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"736e35e8-db53-456b-a374-50f70159f967\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.115219 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.153397 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29552581-nh5lq"]
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.154955 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.165023 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552581-nh5lq"]
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.220736 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftgw\" (UniqueName: \"kubernetes.io/projected/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-kube-api-access-vftgw\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.220794 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-fernet-keys\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.220911 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-config-data\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.220938 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-combined-ca-bundle\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.323190 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-fernet-keys\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.323255 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vftgw\" (UniqueName: \"kubernetes.io/projected/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-kube-api-access-vftgw\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.323387 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-config-data\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.323404 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-combined-ca-bundle\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.334172 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-combined-ca-bundle\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.337592 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-fernet-keys\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.349881 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-config-data\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.388829 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vftgw\" (UniqueName: \"kubernetes.io/projected/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-kube-api-access-vftgw\") pod \"keystone-cron-29552581-nh5lq\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.566574 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552581-nh5lq"
Mar 10 15:01:00 crc kubenswrapper[4911]: I0310 15:01:00.796612 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 10 15:01:01 crc kubenswrapper[4911]: I0310 15:01:01.118679 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552581-nh5lq"]
Mar 10 15:01:01 crc kubenswrapper[4911]: I0310 15:01:01.245900 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"736e35e8-db53-456b-a374-50f70159f967","Type":"ContainerStarted","Data":"a7862ff21666e468a34ed31c4d994f003ab4e027f18eb18f546e08bf40c0d1e2"}
Mar 10 15:01:01 crc kubenswrapper[4911]: I0310 15:01:01.248593 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552581-nh5lq" event={"ID":"7a04a21f-13ba-40c3-89ea-a4e87b1fec25","Type":"ContainerStarted","Data":"705822bd3ac8332bd8f49326b3c37b548f75a07fa22df2ff170bc169a5878440"}
Mar 10 15:01:02 crc kubenswrapper[4911]: I0310 15:01:02.260353 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"736e35e8-db53-456b-a374-50f70159f967","Type":"ContainerStarted","Data":"56b84ba10d6beb253cdd8b7756e39ce3ed33fa7db8de01192524e5c286c9f510"}
Mar 10 15:01:02 crc kubenswrapper[4911]: I0310 15:01:02.271289 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552581-nh5lq" event={"ID":"7a04a21f-13ba-40c3-89ea-a4e87b1fec25","Type":"ContainerStarted","Data":"5ac578303932324e9254597d09ef9875a9dec03cbd2bc65ba46d24d28e7d4d0f"}
Mar 10 15:01:02 crc kubenswrapper[4911]: I0310 15:01:02.281080 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.506310146
podStartE2EDuration="3.28105338s" podCreationTimestamp="2026-03-10 15:00:59 +0000 UTC" firstStartedPulling="2026-03-10 15:01:00.810237439 +0000 UTC m=+3565.373757356" lastFinishedPulling="2026-03-10 15:01:01.584980673 +0000 UTC m=+3566.148500590" observedRunningTime="2026-03-10 15:01:02.276791456 +0000 UTC m=+3566.840311393" watchObservedRunningTime="2026-03-10 15:01:02.28105338 +0000 UTC m=+3566.844573297" Mar 10 15:01:02 crc kubenswrapper[4911]: I0310 15:01:02.314338 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29552581-nh5lq" podStartSLOduration=2.314315775 podStartE2EDuration="2.314315775s" podCreationTimestamp="2026-03-10 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:01:02.308425877 +0000 UTC m=+3566.871945794" watchObservedRunningTime="2026-03-10 15:01:02.314315775 +0000 UTC m=+3566.877835692" Mar 10 15:01:04 crc kubenswrapper[4911]: I0310 15:01:04.297472 4911 generic.go:334] "Generic (PLEG): container finished" podID="7a04a21f-13ba-40c3-89ea-a4e87b1fec25" containerID="5ac578303932324e9254597d09ef9875a9dec03cbd2bc65ba46d24d28e7d4d0f" exitCode=0 Mar 10 15:01:04 crc kubenswrapper[4911]: I0310 15:01:04.297641 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552581-nh5lq" event={"ID":"7a04a21f-13ba-40c3-89ea-a4e87b1fec25","Type":"ContainerDied","Data":"5ac578303932324e9254597d09ef9875a9dec03cbd2bc65ba46d24d28e7d4d0f"} Mar 10 15:01:05 crc kubenswrapper[4911]: I0310 15:01:05.772950 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552581-nh5lq" Mar 10 15:01:05 crc kubenswrapper[4911]: I0310 15:01:05.959202 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-combined-ca-bundle\") pod \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " Mar 10 15:01:05 crc kubenswrapper[4911]: I0310 15:01:05.959371 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-fernet-keys\") pod \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " Mar 10 15:01:05 crc kubenswrapper[4911]: I0310 15:01:05.959513 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vftgw\" (UniqueName: \"kubernetes.io/projected/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-kube-api-access-vftgw\") pod \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " Mar 10 15:01:05 crc kubenswrapper[4911]: I0310 15:01:05.962222 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-config-data\") pod \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\" (UID: \"7a04a21f-13ba-40c3-89ea-a4e87b1fec25\") " Mar 10 15:01:05 crc kubenswrapper[4911]: I0310 15:01:05.978187 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-kube-api-access-vftgw" (OuterVolumeSpecName: "kube-api-access-vftgw") pod "7a04a21f-13ba-40c3-89ea-a4e87b1fec25" (UID: "7a04a21f-13ba-40c3-89ea-a4e87b1fec25"). InnerVolumeSpecName "kube-api-access-vftgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:01:06 crc kubenswrapper[4911]: I0310 15:01:06.001270 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7a04a21f-13ba-40c3-89ea-a4e87b1fec25" (UID: "7a04a21f-13ba-40c3-89ea-a4e87b1fec25"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:01:06 crc kubenswrapper[4911]: I0310 15:01:06.040576 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a04a21f-13ba-40c3-89ea-a4e87b1fec25" (UID: "7a04a21f-13ba-40c3-89ea-a4e87b1fec25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:01:06 crc kubenswrapper[4911]: I0310 15:01:06.064964 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-config-data" (OuterVolumeSpecName: "config-data") pod "7a04a21f-13ba-40c3-89ea-a4e87b1fec25" (UID: "7a04a21f-13ba-40c3-89ea-a4e87b1fec25"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:01:06 crc kubenswrapper[4911]: I0310 15:01:06.072340 4911 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 15:01:06 crc kubenswrapper[4911]: I0310 15:01:06.072380 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vftgw\" (UniqueName: \"kubernetes.io/projected/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-kube-api-access-vftgw\") on node \"crc\" DevicePath \"\"" Mar 10 15:01:06 crc kubenswrapper[4911]: I0310 15:01:06.072395 4911 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:01:06 crc kubenswrapper[4911]: I0310 15:01:06.072407 4911 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a04a21f-13ba-40c3-89ea-a4e87b1fec25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:01:06 crc kubenswrapper[4911]: I0310 15:01:06.319709 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552581-nh5lq" event={"ID":"7a04a21f-13ba-40c3-89ea-a4e87b1fec25","Type":"ContainerDied","Data":"705822bd3ac8332bd8f49326b3c37b548f75a07fa22df2ff170bc169a5878440"} Mar 10 15:01:06 crc kubenswrapper[4911]: I0310 15:01:06.319776 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="705822bd3ac8332bd8f49326b3c37b548f75a07fa22df2ff170bc169a5878440" Mar 10 15:01:06 crc kubenswrapper[4911]: I0310 15:01:06.319835 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552581-nh5lq" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.053892 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4vz7n/must-gather-zmqcp"] Mar 10 15:01:24 crc kubenswrapper[4911]: E0310 15:01:24.055439 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a04a21f-13ba-40c3-89ea-a4e87b1fec25" containerName="keystone-cron" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.055454 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a04a21f-13ba-40c3-89ea-a4e87b1fec25" containerName="keystone-cron" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.055865 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a04a21f-13ba-40c3-89ea-a4e87b1fec25" containerName="keystone-cron" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.057431 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/must-gather-zmqcp" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.065151 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4vz7n"/"openshift-service-ca.crt" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.065413 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4vz7n"/"default-dockercfg-bx57r" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.065805 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4vz7n"/"kube-root-ca.crt" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.102675 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4vz7n/must-gather-zmqcp"] Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.193817 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-must-gather-output\") pod \"must-gather-zmqcp\" (UID: \"6ad1bc14-a3de-4320-90e3-7c49c5f3014f\") " pod="openshift-must-gather-4vz7n/must-gather-zmqcp" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.194456 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trxvb\" (UniqueName: \"kubernetes.io/projected/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-kube-api-access-trxvb\") pod \"must-gather-zmqcp\" (UID: \"6ad1bc14-a3de-4320-90e3-7c49c5f3014f\") " pod="openshift-must-gather-4vz7n/must-gather-zmqcp" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.296391 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-must-gather-output\") pod \"must-gather-zmqcp\" (UID: \"6ad1bc14-a3de-4320-90e3-7c49c5f3014f\") " pod="openshift-must-gather-4vz7n/must-gather-zmqcp" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.296489 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trxvb\" (UniqueName: \"kubernetes.io/projected/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-kube-api-access-trxvb\") pod \"must-gather-zmqcp\" (UID: \"6ad1bc14-a3de-4320-90e3-7c49c5f3014f\") " pod="openshift-must-gather-4vz7n/must-gather-zmqcp" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.297352 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-must-gather-output\") pod \"must-gather-zmqcp\" (UID: \"6ad1bc14-a3de-4320-90e3-7c49c5f3014f\") " pod="openshift-must-gather-4vz7n/must-gather-zmqcp" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.318450 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trxvb\" (UniqueName: 
\"kubernetes.io/projected/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-kube-api-access-trxvb\") pod \"must-gather-zmqcp\" (UID: \"6ad1bc14-a3de-4320-90e3-7c49c5f3014f\") " pod="openshift-must-gather-4vz7n/must-gather-zmqcp" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.430063 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/must-gather-zmqcp" Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.950542 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:01:24 crc kubenswrapper[4911]: I0310 15:01:24.952859 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4vz7n/must-gather-zmqcp"] Mar 10 15:01:25 crc kubenswrapper[4911]: I0310 15:01:25.515838 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/must-gather-zmqcp" event={"ID":"6ad1bc14-a3de-4320-90e3-7c49c5f3014f","Type":"ContainerStarted","Data":"aee65a015add86ea4bf07c3a3c081ebda825865035fb38b98dff37333305cb4a"} Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.613106 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/must-gather-zmqcp" event={"ID":"6ad1bc14-a3de-4320-90e3-7c49c5f3014f","Type":"ContainerStarted","Data":"8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978"} Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.613772 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/must-gather-zmqcp" event={"ID":"6ad1bc14-a3de-4320-90e3-7c49c5f3014f","Type":"ContainerStarted","Data":"c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77"} Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.636278 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4vz7n/must-gather-zmqcp" podStartSLOduration=3.066520332 podStartE2EDuration="9.636258644s" podCreationTimestamp="2026-03-10 
15:01:23 +0000 UTC" firstStartedPulling="2026-03-10 15:01:24.95023065 +0000 UTC m=+3589.513750567" lastFinishedPulling="2026-03-10 15:01:31.519968962 +0000 UTC m=+3596.083488879" observedRunningTime="2026-03-10 15:01:32.632085481 +0000 UTC m=+3597.195605398" watchObservedRunningTime="2026-03-10 15:01:32.636258644 +0000 UTC m=+3597.199778561" Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.773231 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9pxqh"] Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.776221 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.794634 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9pxqh"] Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.855272 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-catalog-content\") pod \"certified-operators-9pxqh\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.855360 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9nml\" (UniqueName: \"kubernetes.io/projected/a0f483dd-ff52-4387-862b-2d2c9c681788-kube-api-access-v9nml\") pod \"certified-operators-9pxqh\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.855604 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-utilities\") pod 
\"certified-operators-9pxqh\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.958260 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-utilities\") pod \"certified-operators-9pxqh\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.958388 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-catalog-content\") pod \"certified-operators-9pxqh\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.958471 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9nml\" (UniqueName: \"kubernetes.io/projected/a0f483dd-ff52-4387-862b-2d2c9c681788-kube-api-access-v9nml\") pod \"certified-operators-9pxqh\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.958997 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-utilities\") pod \"certified-operators-9pxqh\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.959013 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-catalog-content\") pod \"certified-operators-9pxqh\" (UID: 
\"a0f483dd-ff52-4387-862b-2d2c9c681788\") " pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:32 crc kubenswrapper[4911]: I0310 15:01:32.985921 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9nml\" (UniqueName: \"kubernetes.io/projected/a0f483dd-ff52-4387-862b-2d2c9c681788-kube-api-access-v9nml\") pod \"certified-operators-9pxqh\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:33 crc kubenswrapper[4911]: I0310 15:01:33.121362 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:33 crc kubenswrapper[4911]: I0310 15:01:33.815890 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9pxqh"] Mar 10 15:01:34 crc kubenswrapper[4911]: I0310 15:01:34.676857 4911 generic.go:334] "Generic (PLEG): container finished" podID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerID="0f9807354ae6f128d50915453b62673d14b3a8740de91c3a2e5eb12ffc837bfc" exitCode=0 Mar 10 15:01:34 crc kubenswrapper[4911]: I0310 15:01:34.677027 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9pxqh" event={"ID":"a0f483dd-ff52-4387-862b-2d2c9c681788","Type":"ContainerDied","Data":"0f9807354ae6f128d50915453b62673d14b3a8740de91c3a2e5eb12ffc837bfc"} Mar 10 15:01:34 crc kubenswrapper[4911]: I0310 15:01:34.677448 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9pxqh" event={"ID":"a0f483dd-ff52-4387-862b-2d2c9c681788","Type":"ContainerStarted","Data":"1544e96280a780af8197c75850260dc0133057fc66f358e38d8ac7124b63fbff"} Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.285706 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4vz7n/crc-debug-98x4h"] Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.288631 4911 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-98x4h" Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.376619 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-host\") pod \"crc-debug-98x4h\" (UID: \"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d\") " pod="openshift-must-gather-4vz7n/crc-debug-98x4h" Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.376739 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b6fx\" (UniqueName: \"kubernetes.io/projected/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-kube-api-access-6b6fx\") pod \"crc-debug-98x4h\" (UID: \"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d\") " pod="openshift-must-gather-4vz7n/crc-debug-98x4h" Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.479042 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-host\") pod \"crc-debug-98x4h\" (UID: \"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d\") " pod="openshift-must-gather-4vz7n/crc-debug-98x4h" Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.479113 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b6fx\" (UniqueName: \"kubernetes.io/projected/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-kube-api-access-6b6fx\") pod \"crc-debug-98x4h\" (UID: \"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d\") " pod="openshift-must-gather-4vz7n/crc-debug-98x4h" Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.479573 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-host\") pod \"crc-debug-98x4h\" (UID: \"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d\") " 
pod="openshift-must-gather-4vz7n/crc-debug-98x4h" Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.500407 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b6fx\" (UniqueName: \"kubernetes.io/projected/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-kube-api-access-6b6fx\") pod \"crc-debug-98x4h\" (UID: \"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d\") " pod="openshift-must-gather-4vz7n/crc-debug-98x4h" Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.616901 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-98x4h" Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.701099 4911 generic.go:334] "Generic (PLEG): container finished" podID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerID="cfcc3956eb9cefb459bf2228bb85155a26b11d3c7194c25fc046a9b8c562f48f" exitCode=0 Mar 10 15:01:36 crc kubenswrapper[4911]: I0310 15:01:36.702171 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9pxqh" event={"ID":"a0f483dd-ff52-4387-862b-2d2c9c681788","Type":"ContainerDied","Data":"cfcc3956eb9cefb459bf2228bb85155a26b11d3c7194c25fc046a9b8c562f48f"} Mar 10 15:01:36 crc kubenswrapper[4911]: W0310 15:01:36.707261 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8e1a2e2_6391_4bd8_90b7_d555caa1bd9d.slice/crio-c4f13af858b35f0523758e483552b84eef9d6314233c54742c89d24ead374f66 WatchSource:0}: Error finding container c4f13af858b35f0523758e483552b84eef9d6314233c54742c89d24ead374f66: Status 404 returned error can't find the container with id c4f13af858b35f0523758e483552b84eef9d6314233c54742c89d24ead374f66 Mar 10 15:01:37 crc kubenswrapper[4911]: I0310 15:01:37.712998 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/crc-debug-98x4h" 
event={"ID":"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d","Type":"ContainerStarted","Data":"c4f13af858b35f0523758e483552b84eef9d6314233c54742c89d24ead374f66"} Mar 10 15:01:37 crc kubenswrapper[4911]: I0310 15:01:37.717414 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9pxqh" event={"ID":"a0f483dd-ff52-4387-862b-2d2c9c681788","Type":"ContainerStarted","Data":"cc1d2acb3868d4e9e154c9c17a4642885b2799497a3d531ac0ef30669c105ee1"} Mar 10 15:01:37 crc kubenswrapper[4911]: I0310 15:01:37.749175 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9pxqh" podStartSLOduration=3.143551089 podStartE2EDuration="5.74915007s" podCreationTimestamp="2026-03-10 15:01:32 +0000 UTC" firstStartedPulling="2026-03-10 15:01:34.680842641 +0000 UTC m=+3599.244362558" lastFinishedPulling="2026-03-10 15:01:37.286441622 +0000 UTC m=+3601.849961539" observedRunningTime="2026-03-10 15:01:37.742895522 +0000 UTC m=+3602.306415439" watchObservedRunningTime="2026-03-10 15:01:37.74915007 +0000 UTC m=+3602.312669987" Mar 10 15:01:43 crc kubenswrapper[4911]: I0310 15:01:43.122201 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:43 crc kubenswrapper[4911]: I0310 15:01:43.123194 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:43 crc kubenswrapper[4911]: I0310 15:01:43.195816 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:43 crc kubenswrapper[4911]: I0310 15:01:43.856772 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:43 crc kubenswrapper[4911]: I0310 15:01:43.927632 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-9pxqh"] Mar 10 15:01:45 crc kubenswrapper[4911]: I0310 15:01:45.809239 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9pxqh" podUID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerName="registry-server" containerID="cri-o://cc1d2acb3868d4e9e154c9c17a4642885b2799497a3d531ac0ef30669c105ee1" gracePeriod=2 Mar 10 15:01:46 crc kubenswrapper[4911]: I0310 15:01:46.823478 4911 generic.go:334] "Generic (PLEG): container finished" podID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerID="cc1d2acb3868d4e9e154c9c17a4642885b2799497a3d531ac0ef30669c105ee1" exitCode=0 Mar 10 15:01:46 crc kubenswrapper[4911]: I0310 15:01:46.823537 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9pxqh" event={"ID":"a0f483dd-ff52-4387-862b-2d2c9c681788","Type":"ContainerDied","Data":"cc1d2acb3868d4e9e154c9c17a4642885b2799497a3d531ac0ef30669c105ee1"} Mar 10 15:01:48 crc kubenswrapper[4911]: I0310 15:01:48.521167 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:01:48 crc kubenswrapper[4911]: I0310 15:01:48.521811 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.131683 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.177914 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-catalog-content\") pod \"a0f483dd-ff52-4387-862b-2d2c9c681788\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.178379 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9nml\" (UniqueName: \"kubernetes.io/projected/a0f483dd-ff52-4387-862b-2d2c9c681788-kube-api-access-v9nml\") pod \"a0f483dd-ff52-4387-862b-2d2c9c681788\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.178515 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-utilities\") pod \"a0f483dd-ff52-4387-862b-2d2c9c681788\" (UID: \"a0f483dd-ff52-4387-862b-2d2c9c681788\") " Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.179411 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-utilities" (OuterVolumeSpecName: "utilities") pod "a0f483dd-ff52-4387-862b-2d2c9c681788" (UID: "a0f483dd-ff52-4387-862b-2d2c9c681788"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.189270 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f483dd-ff52-4387-862b-2d2c9c681788-kube-api-access-v9nml" (OuterVolumeSpecName: "kube-api-access-v9nml") pod "a0f483dd-ff52-4387-862b-2d2c9c681788" (UID: "a0f483dd-ff52-4387-862b-2d2c9c681788"). InnerVolumeSpecName "kube-api-access-v9nml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.267168 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0f483dd-ff52-4387-862b-2d2c9c681788" (UID: "a0f483dd-ff52-4387-862b-2d2c9c681788"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.293779 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.293835 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9nml\" (UniqueName: \"kubernetes.io/projected/a0f483dd-ff52-4387-862b-2d2c9c681788-kube-api-access-v9nml\") on node \"crc\" DevicePath \"\"" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.293854 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f483dd-ff52-4387-862b-2d2c9c681788-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.878262 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9pxqh" event={"ID":"a0f483dd-ff52-4387-862b-2d2c9c681788","Type":"ContainerDied","Data":"1544e96280a780af8197c75850260dc0133057fc66f358e38d8ac7124b63fbff"} Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.878360 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9pxqh" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.878977 4911 scope.go:117] "RemoveContainer" containerID="cc1d2acb3868d4e9e154c9c17a4642885b2799497a3d531ac0ef30669c105ee1" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.881287 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/crc-debug-98x4h" event={"ID":"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d","Type":"ContainerStarted","Data":"ca41a574ddec81823410196108087996467f7c6edb130030fed788baea706471"} Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.911135 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4vz7n/crc-debug-98x4h" podStartSLOduration=1.763191168 podStartE2EDuration="15.911109152s" podCreationTimestamp="2026-03-10 15:01:36 +0000 UTC" firstStartedPulling="2026-03-10 15:01:36.71100736 +0000 UTC m=+3601.274527287" lastFinishedPulling="2026-03-10 15:01:50.858925354 +0000 UTC m=+3615.422445271" observedRunningTime="2026-03-10 15:01:51.900476286 +0000 UTC m=+3616.463996213" watchObservedRunningTime="2026-03-10 15:01:51.911109152 +0000 UTC m=+3616.474629069" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.916267 4911 scope.go:117] "RemoveContainer" containerID="cfcc3956eb9cefb459bf2228bb85155a26b11d3c7194c25fc046a9b8c562f48f" Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.952874 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9pxqh"] Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.970233 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9pxqh"] Mar 10 15:01:51 crc kubenswrapper[4911]: I0310 15:01:51.973104 4911 scope.go:117] "RemoveContainer" containerID="0f9807354ae6f128d50915453b62673d14b3a8740de91c3a2e5eb12ffc837bfc" Mar 10 15:01:52 crc kubenswrapper[4911]: I0310 15:01:52.211251 4911 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f483dd-ff52-4387-862b-2d2c9c681788" path="/var/lib/kubelet/pods/a0f483dd-ff52-4387-862b-2d2c9c681788/volumes" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.154539 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552582-qm5rq"] Mar 10 15:02:00 crc kubenswrapper[4911]: E0310 15:02:00.155776 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerName="registry-server" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.155842 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerName="registry-server" Mar 10 15:02:00 crc kubenswrapper[4911]: E0310 15:02:00.155881 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerName="extract-utilities" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.155889 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerName="extract-utilities" Mar 10 15:02:00 crc kubenswrapper[4911]: E0310 15:02:00.155900 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerName="extract-content" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.155905 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerName="extract-content" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.156173 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f483dd-ff52-4387-862b-2d2c9c681788" containerName="registry-server" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.157122 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552582-qm5rq" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.160009 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.160129 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.160546 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.164552 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552582-qm5rq"] Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.293440 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvtzh\" (UniqueName: \"kubernetes.io/projected/cf406f01-5c7f-4439-8061-c41d07896874-kube-api-access-vvtzh\") pod \"auto-csr-approver-29552582-qm5rq\" (UID: \"cf406f01-5c7f-4439-8061-c41d07896874\") " pod="openshift-infra/auto-csr-approver-29552582-qm5rq" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.396992 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvtzh\" (UniqueName: \"kubernetes.io/projected/cf406f01-5c7f-4439-8061-c41d07896874-kube-api-access-vvtzh\") pod \"auto-csr-approver-29552582-qm5rq\" (UID: \"cf406f01-5c7f-4439-8061-c41d07896874\") " pod="openshift-infra/auto-csr-approver-29552582-qm5rq" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.424261 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvtzh\" (UniqueName: \"kubernetes.io/projected/cf406f01-5c7f-4439-8061-c41d07896874-kube-api-access-vvtzh\") pod \"auto-csr-approver-29552582-qm5rq\" (UID: \"cf406f01-5c7f-4439-8061-c41d07896874\") " 
pod="openshift-infra/auto-csr-approver-29552582-qm5rq" Mar 10 15:02:00 crc kubenswrapper[4911]: I0310 15:02:00.485507 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552582-qm5rq" Mar 10 15:02:01 crc kubenswrapper[4911]: I0310 15:02:01.112883 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552582-qm5rq"] Mar 10 15:02:01 crc kubenswrapper[4911]: I0310 15:02:01.986145 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552582-qm5rq" event={"ID":"cf406f01-5c7f-4439-8061-c41d07896874","Type":"ContainerStarted","Data":"d8a802cf3b3b74bf2832ebc6c16c37cc72504459073a9c0d9ec9663150dfae7d"} Mar 10 15:02:02 crc kubenswrapper[4911]: I0310 15:02:02.998297 4911 generic.go:334] "Generic (PLEG): container finished" podID="cf406f01-5c7f-4439-8061-c41d07896874" containerID="0495e2552c3e735b6e7b76824918427b15f9dc9fb93bad6cd0394544eed53aaa" exitCode=0 Mar 10 15:02:02 crc kubenswrapper[4911]: I0310 15:02:02.998420 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552582-qm5rq" event={"ID":"cf406f01-5c7f-4439-8061-c41d07896874","Type":"ContainerDied","Data":"0495e2552c3e735b6e7b76824918427b15f9dc9fb93bad6cd0394544eed53aaa"} Mar 10 15:02:05 crc kubenswrapper[4911]: I0310 15:02:05.025410 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552582-qm5rq" event={"ID":"cf406f01-5c7f-4439-8061-c41d07896874","Type":"ContainerDied","Data":"d8a802cf3b3b74bf2832ebc6c16c37cc72504459073a9c0d9ec9663150dfae7d"} Mar 10 15:02:05 crc kubenswrapper[4911]: I0310 15:02:05.026012 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8a802cf3b3b74bf2832ebc6c16c37cc72504459073a9c0d9ec9663150dfae7d" Mar 10 15:02:05 crc kubenswrapper[4911]: I0310 15:02:05.116293 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552582-qm5rq" Mar 10 15:02:05 crc kubenswrapper[4911]: I0310 15:02:05.204421 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvtzh\" (UniqueName: \"kubernetes.io/projected/cf406f01-5c7f-4439-8061-c41d07896874-kube-api-access-vvtzh\") pod \"cf406f01-5c7f-4439-8061-c41d07896874\" (UID: \"cf406f01-5c7f-4439-8061-c41d07896874\") " Mar 10 15:02:05 crc kubenswrapper[4911]: I0310 15:02:05.212679 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf406f01-5c7f-4439-8061-c41d07896874-kube-api-access-vvtzh" (OuterVolumeSpecName: "kube-api-access-vvtzh") pod "cf406f01-5c7f-4439-8061-c41d07896874" (UID: "cf406f01-5c7f-4439-8061-c41d07896874"). InnerVolumeSpecName "kube-api-access-vvtzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:02:05 crc kubenswrapper[4911]: I0310 15:02:05.306981 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvtzh\" (UniqueName: \"kubernetes.io/projected/cf406f01-5c7f-4439-8061-c41d07896874-kube-api-access-vvtzh\") on node \"crc\" DevicePath \"\"" Mar 10 15:02:06 crc kubenswrapper[4911]: I0310 15:02:06.033345 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552582-qm5rq" Mar 10 15:02:06 crc kubenswrapper[4911]: I0310 15:02:06.224114 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552576-9kx6p"] Mar 10 15:02:06 crc kubenswrapper[4911]: I0310 15:02:06.236799 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552576-9kx6p"] Mar 10 15:02:08 crc kubenswrapper[4911]: I0310 15:02:08.205329 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b1a3fe-2219-4c28-9b35-f1e0a9e4977c" path="/var/lib/kubelet/pods/47b1a3fe-2219-4c28-9b35-f1e0a9e4977c/volumes" Mar 10 15:02:18 crc kubenswrapper[4911]: I0310 15:02:18.520875 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:02:18 crc kubenswrapper[4911]: I0310 15:02:18.521531 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:02:33 crc kubenswrapper[4911]: I0310 15:02:33.325842 4911 generic.go:334] "Generic (PLEG): container finished" podID="b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d" containerID="ca41a574ddec81823410196108087996467f7c6edb130030fed788baea706471" exitCode=0 Mar 10 15:02:33 crc kubenswrapper[4911]: I0310 15:02:33.325962 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/crc-debug-98x4h" event={"ID":"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d","Type":"ContainerDied","Data":"ca41a574ddec81823410196108087996467f7c6edb130030fed788baea706471"} 
Mar 10 15:02:34 crc kubenswrapper[4911]: I0310 15:02:34.465432 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-98x4h" Mar 10 15:02:34 crc kubenswrapper[4911]: I0310 15:02:34.516486 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4vz7n/crc-debug-98x4h"] Mar 10 15:02:34 crc kubenswrapper[4911]: I0310 15:02:34.527301 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4vz7n/crc-debug-98x4h"] Mar 10 15:02:34 crc kubenswrapper[4911]: I0310 15:02:34.665931 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b6fx\" (UniqueName: \"kubernetes.io/projected/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-kube-api-access-6b6fx\") pod \"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d\" (UID: \"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d\") " Mar 10 15:02:34 crc kubenswrapper[4911]: I0310 15:02:34.666164 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-host\") pod \"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d\" (UID: \"b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d\") " Mar 10 15:02:34 crc kubenswrapper[4911]: I0310 15:02:34.666352 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-host" (OuterVolumeSpecName: "host") pod "b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d" (UID: "b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:02:34 crc kubenswrapper[4911]: I0310 15:02:34.666598 4911 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-host\") on node \"crc\" DevicePath \"\"" Mar 10 15:02:34 crc kubenswrapper[4911]: I0310 15:02:34.672894 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-kube-api-access-6b6fx" (OuterVolumeSpecName: "kube-api-access-6b6fx") pod "b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d" (UID: "b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d"). InnerVolumeSpecName "kube-api-access-6b6fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:02:34 crc kubenswrapper[4911]: I0310 15:02:34.768401 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b6fx\" (UniqueName: \"kubernetes.io/projected/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d-kube-api-access-6b6fx\") on node \"crc\" DevicePath \"\"" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.350446 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4f13af858b35f0523758e483552b84eef9d6314233c54742c89d24ead374f66" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.350526 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-98x4h" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.738356 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4vz7n/crc-debug-g728m"] Mar 10 15:02:35 crc kubenswrapper[4911]: E0310 15:02:35.738950 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf406f01-5c7f-4439-8061-c41d07896874" containerName="oc" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.738968 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf406f01-5c7f-4439-8061-c41d07896874" containerName="oc" Mar 10 15:02:35 crc kubenswrapper[4911]: E0310 15:02:35.739002 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d" containerName="container-00" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.739009 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d" containerName="container-00" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.739820 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf406f01-5c7f-4439-8061-c41d07896874" containerName="oc" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.739879 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d" containerName="container-00" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.740952 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-g728m" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.895996 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9b4b\" (UniqueName: \"kubernetes.io/projected/88ca794b-5d9b-42fc-b807-39186979a35d-kube-api-access-t9b4b\") pod \"crc-debug-g728m\" (UID: \"88ca794b-5d9b-42fc-b807-39186979a35d\") " pod="openshift-must-gather-4vz7n/crc-debug-g728m" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.896058 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88ca794b-5d9b-42fc-b807-39186979a35d-host\") pod \"crc-debug-g728m\" (UID: \"88ca794b-5d9b-42fc-b807-39186979a35d\") " pod="openshift-must-gather-4vz7n/crc-debug-g728m" Mar 10 15:02:35 crc kubenswrapper[4911]: I0310 15:02:35.999623 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9b4b\" (UniqueName: \"kubernetes.io/projected/88ca794b-5d9b-42fc-b807-39186979a35d-kube-api-access-t9b4b\") pod \"crc-debug-g728m\" (UID: \"88ca794b-5d9b-42fc-b807-39186979a35d\") " pod="openshift-must-gather-4vz7n/crc-debug-g728m" Mar 10 15:02:36 crc kubenswrapper[4911]: I0310 15:02:36.000014 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88ca794b-5d9b-42fc-b807-39186979a35d-host\") pod \"crc-debug-g728m\" (UID: \"88ca794b-5d9b-42fc-b807-39186979a35d\") " pod="openshift-must-gather-4vz7n/crc-debug-g728m" Mar 10 15:02:36 crc kubenswrapper[4911]: I0310 15:02:36.000167 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88ca794b-5d9b-42fc-b807-39186979a35d-host\") pod \"crc-debug-g728m\" (UID: \"88ca794b-5d9b-42fc-b807-39186979a35d\") " pod="openshift-must-gather-4vz7n/crc-debug-g728m" Mar 10 15:02:36 crc 
kubenswrapper[4911]: I0310 15:02:36.018048 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9b4b\" (UniqueName: \"kubernetes.io/projected/88ca794b-5d9b-42fc-b807-39186979a35d-kube-api-access-t9b4b\") pod \"crc-debug-g728m\" (UID: \"88ca794b-5d9b-42fc-b807-39186979a35d\") " pod="openshift-must-gather-4vz7n/crc-debug-g728m" Mar 10 15:02:36 crc kubenswrapper[4911]: I0310 15:02:36.057076 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-g728m" Mar 10 15:02:36 crc kubenswrapper[4911]: I0310 15:02:36.212390 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d" path="/var/lib/kubelet/pods/b8e1a2e2-6391-4bd8-90b7-d555caa1bd9d/volumes" Mar 10 15:02:36 crc kubenswrapper[4911]: I0310 15:02:36.362016 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/crc-debug-g728m" event={"ID":"88ca794b-5d9b-42fc-b807-39186979a35d","Type":"ContainerStarted","Data":"937980658f5d1594d1f0fa3a173b36f09f330b5869b0918be8f31942e949a20c"} Mar 10 15:02:36 crc kubenswrapper[4911]: I0310 15:02:36.362100 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/crc-debug-g728m" event={"ID":"88ca794b-5d9b-42fc-b807-39186979a35d","Type":"ContainerStarted","Data":"83781791091a13a901eb17f12c963784cc42978b63c604f39480bd00049b7084"} Mar 10 15:02:36 crc kubenswrapper[4911]: I0310 15:02:36.379679 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4vz7n/crc-debug-g728m" podStartSLOduration=1.379648572 podStartE2EDuration="1.379648572s" podCreationTimestamp="2026-03-10 15:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:02:36.375963778 +0000 UTC m=+3660.939483705" watchObservedRunningTime="2026-03-10 15:02:36.379648572 +0000 
UTC m=+3660.943168499" Mar 10 15:02:37 crc kubenswrapper[4911]: I0310 15:02:37.372624 4911 generic.go:334] "Generic (PLEG): container finished" podID="88ca794b-5d9b-42fc-b807-39186979a35d" containerID="937980658f5d1594d1f0fa3a173b36f09f330b5869b0918be8f31942e949a20c" exitCode=0 Mar 10 15:02:37 crc kubenswrapper[4911]: I0310 15:02:37.372693 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/crc-debug-g728m" event={"ID":"88ca794b-5d9b-42fc-b807-39186979a35d","Type":"ContainerDied","Data":"937980658f5d1594d1f0fa3a173b36f09f330b5869b0918be8f31942e949a20c"} Mar 10 15:02:38 crc kubenswrapper[4911]: I0310 15:02:38.493694 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-g728m" Mar 10 15:02:38 crc kubenswrapper[4911]: I0310 15:02:38.526210 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4vz7n/crc-debug-g728m"] Mar 10 15:02:38 crc kubenswrapper[4911]: I0310 15:02:38.535351 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4vz7n/crc-debug-g728m"] Mar 10 15:02:38 crc kubenswrapper[4911]: I0310 15:02:38.654935 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9b4b\" (UniqueName: \"kubernetes.io/projected/88ca794b-5d9b-42fc-b807-39186979a35d-kube-api-access-t9b4b\") pod \"88ca794b-5d9b-42fc-b807-39186979a35d\" (UID: \"88ca794b-5d9b-42fc-b807-39186979a35d\") " Mar 10 15:02:38 crc kubenswrapper[4911]: I0310 15:02:38.655059 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88ca794b-5d9b-42fc-b807-39186979a35d-host\") pod \"88ca794b-5d9b-42fc-b807-39186979a35d\" (UID: \"88ca794b-5d9b-42fc-b807-39186979a35d\") " Mar 10 15:02:38 crc kubenswrapper[4911]: I0310 15:02:38.655623 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/88ca794b-5d9b-42fc-b807-39186979a35d-host" (OuterVolumeSpecName: "host") pod "88ca794b-5d9b-42fc-b807-39186979a35d" (UID: "88ca794b-5d9b-42fc-b807-39186979a35d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:02:38 crc kubenswrapper[4911]: I0310 15:02:38.661961 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ca794b-5d9b-42fc-b807-39186979a35d-kube-api-access-t9b4b" (OuterVolumeSpecName: "kube-api-access-t9b4b") pod "88ca794b-5d9b-42fc-b807-39186979a35d" (UID: "88ca794b-5d9b-42fc-b807-39186979a35d"). InnerVolumeSpecName "kube-api-access-t9b4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:02:38 crc kubenswrapper[4911]: I0310 15:02:38.757408 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9b4b\" (UniqueName: \"kubernetes.io/projected/88ca794b-5d9b-42fc-b807-39186979a35d-kube-api-access-t9b4b\") on node \"crc\" DevicePath \"\"" Mar 10 15:02:38 crc kubenswrapper[4911]: I0310 15:02:38.757454 4911 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88ca794b-5d9b-42fc-b807-39186979a35d-host\") on node \"crc\" DevicePath \"\"" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.401404 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83781791091a13a901eb17f12c963784cc42978b63c604f39480bd00049b7084" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.401537 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-g728m" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.706681 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4vz7n/crc-debug-dbhpr"] Mar 10 15:02:39 crc kubenswrapper[4911]: E0310 15:02:39.707642 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ca794b-5d9b-42fc-b807-39186979a35d" containerName="container-00" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.707659 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ca794b-5d9b-42fc-b807-39186979a35d" containerName="container-00" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.707948 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ca794b-5d9b-42fc-b807-39186979a35d" containerName="container-00" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.708741 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-dbhpr" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.781956 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5ef9046-bbb2-49fd-97e5-d67ec3057127-host\") pod \"crc-debug-dbhpr\" (UID: \"d5ef9046-bbb2-49fd-97e5-d67ec3057127\") " pod="openshift-must-gather-4vz7n/crc-debug-dbhpr" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.782040 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4958q\" (UniqueName: \"kubernetes.io/projected/d5ef9046-bbb2-49fd-97e5-d67ec3057127-kube-api-access-4958q\") pod \"crc-debug-dbhpr\" (UID: \"d5ef9046-bbb2-49fd-97e5-d67ec3057127\") " pod="openshift-must-gather-4vz7n/crc-debug-dbhpr" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.884956 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d5ef9046-bbb2-49fd-97e5-d67ec3057127-host\") pod \"crc-debug-dbhpr\" (UID: \"d5ef9046-bbb2-49fd-97e5-d67ec3057127\") " pod="openshift-must-gather-4vz7n/crc-debug-dbhpr" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.885023 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4958q\" (UniqueName: \"kubernetes.io/projected/d5ef9046-bbb2-49fd-97e5-d67ec3057127-kube-api-access-4958q\") pod \"crc-debug-dbhpr\" (UID: \"d5ef9046-bbb2-49fd-97e5-d67ec3057127\") " pod="openshift-must-gather-4vz7n/crc-debug-dbhpr" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.885159 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5ef9046-bbb2-49fd-97e5-d67ec3057127-host\") pod \"crc-debug-dbhpr\" (UID: \"d5ef9046-bbb2-49fd-97e5-d67ec3057127\") " pod="openshift-must-gather-4vz7n/crc-debug-dbhpr" Mar 10 15:02:39 crc kubenswrapper[4911]: I0310 15:02:39.905757 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4958q\" (UniqueName: \"kubernetes.io/projected/d5ef9046-bbb2-49fd-97e5-d67ec3057127-kube-api-access-4958q\") pod \"crc-debug-dbhpr\" (UID: \"d5ef9046-bbb2-49fd-97e5-d67ec3057127\") " pod="openshift-must-gather-4vz7n/crc-debug-dbhpr" Mar 10 15:02:40 crc kubenswrapper[4911]: I0310 15:02:40.035161 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-dbhpr" Mar 10 15:02:40 crc kubenswrapper[4911]: W0310 15:02:40.075015 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ef9046_bbb2_49fd_97e5_d67ec3057127.slice/crio-e791aedd813524af57ef699159be0779460c6945df1aead7e2467bbfa1249015 WatchSource:0}: Error finding container e791aedd813524af57ef699159be0779460c6945df1aead7e2467bbfa1249015: Status 404 returned error can't find the container with id e791aedd813524af57ef699159be0779460c6945df1aead7e2467bbfa1249015 Mar 10 15:02:40 crc kubenswrapper[4911]: I0310 15:02:40.207869 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ca794b-5d9b-42fc-b807-39186979a35d" path="/var/lib/kubelet/pods/88ca794b-5d9b-42fc-b807-39186979a35d/volumes" Mar 10 15:02:40 crc kubenswrapper[4911]: I0310 15:02:40.415840 4911 generic.go:334] "Generic (PLEG): container finished" podID="d5ef9046-bbb2-49fd-97e5-d67ec3057127" containerID="37c221e367a0745ca8dd3f4d3cabe241eca75dbe1036ed88a0fcd8fcfc400dbc" exitCode=0 Mar 10 15:02:40 crc kubenswrapper[4911]: I0310 15:02:40.415910 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/crc-debug-dbhpr" event={"ID":"d5ef9046-bbb2-49fd-97e5-d67ec3057127","Type":"ContainerDied","Data":"37c221e367a0745ca8dd3f4d3cabe241eca75dbe1036ed88a0fcd8fcfc400dbc"} Mar 10 15:02:40 crc kubenswrapper[4911]: I0310 15:02:40.416025 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/crc-debug-dbhpr" event={"ID":"d5ef9046-bbb2-49fd-97e5-d67ec3057127","Type":"ContainerStarted","Data":"e791aedd813524af57ef699159be0779460c6945df1aead7e2467bbfa1249015"} Mar 10 15:02:40 crc kubenswrapper[4911]: I0310 15:02:40.465638 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4vz7n/crc-debug-dbhpr"] Mar 10 15:02:40 crc kubenswrapper[4911]: I0310 15:02:40.475155 4911 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4vz7n/crc-debug-dbhpr"]
Mar 10 15:02:41 crc kubenswrapper[4911]: I0310 15:02:41.569857 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-dbhpr"
Mar 10 15:02:41 crc kubenswrapper[4911]: I0310 15:02:41.722706 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5ef9046-bbb2-49fd-97e5-d67ec3057127-host\") pod \"d5ef9046-bbb2-49fd-97e5-d67ec3057127\" (UID: \"d5ef9046-bbb2-49fd-97e5-d67ec3057127\") "
Mar 10 15:02:41 crc kubenswrapper[4911]: I0310 15:02:41.722854 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5ef9046-bbb2-49fd-97e5-d67ec3057127-host" (OuterVolumeSpecName: "host") pod "d5ef9046-bbb2-49fd-97e5-d67ec3057127" (UID: "d5ef9046-bbb2-49fd-97e5-d67ec3057127"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:02:41 crc kubenswrapper[4911]: I0310 15:02:41.723508 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4958q\" (UniqueName: \"kubernetes.io/projected/d5ef9046-bbb2-49fd-97e5-d67ec3057127-kube-api-access-4958q\") pod \"d5ef9046-bbb2-49fd-97e5-d67ec3057127\" (UID: \"d5ef9046-bbb2-49fd-97e5-d67ec3057127\") "
Mar 10 15:02:41 crc kubenswrapper[4911]: I0310 15:02:41.724241 4911 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5ef9046-bbb2-49fd-97e5-d67ec3057127-host\") on node \"crc\" DevicePath \"\""
Mar 10 15:02:41 crc kubenswrapper[4911]: I0310 15:02:41.731063 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ef9046-bbb2-49fd-97e5-d67ec3057127-kube-api-access-4958q" (OuterVolumeSpecName: "kube-api-access-4958q") pod "d5ef9046-bbb2-49fd-97e5-d67ec3057127" (UID: "d5ef9046-bbb2-49fd-97e5-d67ec3057127"). InnerVolumeSpecName "kube-api-access-4958q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:02:41 crc kubenswrapper[4911]: I0310 15:02:41.826416 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4958q\" (UniqueName: \"kubernetes.io/projected/d5ef9046-bbb2-49fd-97e5-d67ec3057127-kube-api-access-4958q\") on node \"crc\" DevicePath \"\""
Mar 10 15:02:42 crc kubenswrapper[4911]: I0310 15:02:42.206605 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ef9046-bbb2-49fd-97e5-d67ec3057127" path="/var/lib/kubelet/pods/d5ef9046-bbb2-49fd-97e5-d67ec3057127/volumes"
Mar 10 15:02:42 crc kubenswrapper[4911]: I0310 15:02:42.440179 4911 scope.go:117] "RemoveContainer" containerID="37c221e367a0745ca8dd3f4d3cabe241eca75dbe1036ed88a0fcd8fcfc400dbc"
Mar 10 15:02:42 crc kubenswrapper[4911]: I0310 15:02:42.440239 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/crc-debug-dbhpr"
Mar 10 15:02:48 crc kubenswrapper[4911]: I0310 15:02:48.521446 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:02:48 crc kubenswrapper[4911]: I0310 15:02:48.522507 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:02:48 crc kubenswrapper[4911]: I0310 15:02:48.522567 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx"
Mar 10 15:02:48 crc kubenswrapper[4911]: I0310 15:02:48.523812 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 15:02:48 crc kubenswrapper[4911]: I0310 15:02:48.523885 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" gracePeriod=600
Mar 10 15:02:48 crc kubenswrapper[4911]: E0310 15:02:48.650019 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 15:02:49 crc kubenswrapper[4911]: I0310 15:02:49.525373 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" exitCode=0
Mar 10 15:02:49 crc kubenswrapper[4911]: I0310 15:02:49.525440 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848"}
Mar 10 15:02:49 crc kubenswrapper[4911]: I0310 15:02:49.525512 4911 scope.go:117] "RemoveContainer" containerID="19ad5193c0bc4ed506cb6042cc3e642a139d1b08dd49628d1a8a9e56568fa86f"
Mar 10 15:02:49 crc kubenswrapper[4911]: I0310 15:02:49.526549 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848"
Mar 10 15:02:49 crc kubenswrapper[4911]: E0310 15:02:49.526988 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 15:02:56 crc kubenswrapper[4911]: I0310 15:02:56.594190 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6fb47b4698-gx22c_2689d664-9bf8-4c5b-8c53-353286854071/barbican-api/0.log"
Mar 10 15:02:56 crc kubenswrapper[4911]: I0310 15:02:56.699203 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6fb47b4698-gx22c_2689d664-9bf8-4c5b-8c53-353286854071/barbican-api-log/0.log"
Mar 10 15:02:56 crc kubenswrapper[4911]: I0310 15:02:56.826159 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8f48b4f88-jjg7s_d66c76f8-6b9a-40d3-b5fc-d2d5790928f6/barbican-keystone-listener/0.log"
Mar 10 15:02:56 crc kubenswrapper[4911]: I0310 15:02:56.866991 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8f48b4f88-jjg7s_d66c76f8-6b9a-40d3-b5fc-d2d5790928f6/barbican-keystone-listener-log/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.028589 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8599db9-k9r6m_c049ccee-c503-43b1-b263-c6ee453e93e0/barbican-worker/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.090649 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8599db9-k9r6m_c049ccee-c503-43b1-b263-c6ee453e93e0/barbican-worker-log/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.242022 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8_98224edf-8b07-4753-87d9-4f6060957d74/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.311961 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_92bb8486-3729-4b5d-8f09-b99baf382c52/ceilometer-central-agent/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.408162 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_92bb8486-3729-4b5d-8f09-b99baf382c52/ceilometer-notification-agent/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.459814 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_92bb8486-3729-4b5d-8f09-b99baf382c52/proxy-httpd/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.478632 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_92bb8486-3729-4b5d-8f09-b99baf382c52/sg-core/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.687035 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6e666af1-a2f4-4aa0-95c6-f8568be705d8/cinder-api/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.706141 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6e666af1-a2f4-4aa0-95c6-f8568be705d8/cinder-api-log/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.841122 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b634ed72-d485-42b9-a382-24974c25ab42/cinder-scheduler/0.log"
Mar 10 15:02:57 crc kubenswrapper[4911]: I0310 15:02:57.957412 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b634ed72-d485-42b9-a382-24974c25ab42/probe/0.log"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.040436 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-fstmx_7ea3ab89-1a92-47f9-85a5-3df48990343b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.163635 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct_db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.324770 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q56d9_ec6c0ebc-e82c-4981-a32c-8ee98d9496ec/init/0.log"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.466936 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q56d9_ec6c0ebc-e82c-4981-a32c-8ee98d9496ec/init/0.log"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.512331 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q56d9_ec6c0ebc-e82c-4981-a32c-8ee98d9496ec/dnsmasq-dns/0.log"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.572601 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb_37e4aed4-039e-4b2b-89d7-65c43eb8f688/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.682140 4911 scope.go:117] "RemoveContainer" containerID="eddcb6b3f5cd7a6611d592b1865b61840f6dcebbca6b3b5ffa641285ac14b120"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.745160 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_249c149f-3423-4163-8358-b36f6d55c6f3/glance-httpd/0.log"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.760962 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_249c149f-3423-4163-8358-b36f6d55c6f3/glance-log/0.log"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.934851 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_001c3353-3ca1-444c-a741-b2447e3ca566/glance-httpd/0.log"
Mar 10 15:02:58 crc kubenswrapper[4911]: I0310 15:02:58.963439 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_001c3353-3ca1-444c-a741-b2447e3ca566/glance-log/0.log"
Mar 10 15:02:59 crc kubenswrapper[4911]: I0310 15:02:59.127853 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54d884b5d4-lsz26_6be9e57d-52b9-4de2-9201-1b85feda712c/horizon/0.log"
Mar 10 15:02:59 crc kubenswrapper[4911]: I0310 15:02:59.318836 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr_5d1a5e0b-536c-4d5f-9c65-595361611fcd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:02:59 crc kubenswrapper[4911]: I0310 15:02:59.515455 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54d884b5d4-lsz26_6be9e57d-52b9-4de2-9201-1b85feda712c/horizon-log/0.log"
Mar 10 15:02:59 crc kubenswrapper[4911]: I0310 15:02:59.577211 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-p9w8h_6c29812e-9268-4508-aef7-cb43fe278c8d/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:02:59 crc kubenswrapper[4911]: I0310 15:02:59.851632 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29552581-nh5lq_7a04a21f-13ba-40c3-89ea-a4e87b1fec25/keystone-cron/0.log"
Mar 10 15:02:59 crc kubenswrapper[4911]: I0310 15:02:59.907429 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76d846bbc6-4wr5p_3370ba4c-d284-4d51-8b2d-d1da50950def/keystone-api/0.log"
Mar 10 15:03:00 crc kubenswrapper[4911]: I0310 15:03:00.040036 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_cd645aa4-53be-4ede-a00b-e294626fc333/kube-state-metrics/0.log"
Mar 10 15:03:00 crc kubenswrapper[4911]: I0310 15:03:00.158062 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mrq75_2d1eaf3f-414a-426a-8dbf-15825613d50a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:03:00 crc kubenswrapper[4911]: I0310 15:03:00.787118 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d48f5c7d5-2xxzq_5be3e6b2-8478-41bf-9fb1-09e053e8b5ac/neutron-api/0.log"
Mar 10 15:03:00 crc kubenswrapper[4911]: I0310 15:03:00.820607 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d48f5c7d5-2xxzq_5be3e6b2-8478-41bf-9fb1-09e053e8b5ac/neutron-httpd/0.log"
Mar 10 15:03:01 crc kubenswrapper[4911]: I0310 15:03:01.039854 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf_9fffeac8-b15e-48c2-a04e-7f6b6b28e142/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:03:01 crc kubenswrapper[4911]: I0310 15:03:01.193844 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848"
Mar 10 15:03:01 crc kubenswrapper[4911]: E0310 15:03:01.194466 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 15:03:01 crc kubenswrapper[4911]: I0310 15:03:01.639973 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_af0163a2-67bb-4bff-b4c7-c525f764e808/nova-cell0-conductor-conductor/0.log"
Mar 10 15:03:01 crc kubenswrapper[4911]: I0310 15:03:01.659235 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_17b26bd6-b922-485e-9655-001a52e6731c/nova-api-log/0.log"
Mar 10 15:03:01 crc kubenswrapper[4911]: I0310 15:03:01.875176 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_17b26bd6-b922-485e-9655-001a52e6731c/nova-api-api/0.log"
Mar 10 15:03:01 crc kubenswrapper[4911]: I0310 15:03:01.986115 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d81b0082-b7ae-4d38-8dd2-5d20459aa493/nova-cell1-conductor-conductor/0.log"
Mar 10 15:03:01 crc kubenswrapper[4911]: I0310 15:03:01.996371 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3b68c0d5-c7e3-4b1d-b9a0-337f56619c45/nova-cell1-novncproxy-novncproxy/0.log"
Mar 10 15:03:02 crc kubenswrapper[4911]: I0310 15:03:02.185601 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-dksjc_43b7e07c-895e-46e1-9863-4dc4845a72ea/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:03:02 crc kubenswrapper[4911]: I0310 15:03:02.379150 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_607959a6-b845-45ea-b09a-966237b6dd1a/nova-metadata-log/0.log"
Mar 10 15:03:02 crc kubenswrapper[4911]: I0310 15:03:02.689951 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f3e72e32-585d-4c71-9788-fd40c839f2ed/nova-scheduler-scheduler/0.log"
Mar 10 15:03:02 crc kubenswrapper[4911]: I0310 15:03:02.697614 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6e7efec5-8494-472d-b149-a6aeed4810b2/mysql-bootstrap/0.log"
Mar 10 15:03:02 crc kubenswrapper[4911]: I0310 15:03:02.916591 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6e7efec5-8494-472d-b149-a6aeed4810b2/mysql-bootstrap/0.log"
Mar 10 15:03:02 crc kubenswrapper[4911]: I0310 15:03:02.924544 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6e7efec5-8494-472d-b149-a6aeed4810b2/galera/0.log"
Mar 10 15:03:03 crc kubenswrapper[4911]: I0310 15:03:03.161603 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5ff8ebc9-ea10-4e9c-be23-96608817ed84/mysql-bootstrap/0.log"
Mar 10 15:03:03 crc kubenswrapper[4911]: I0310 15:03:03.431405 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5ff8ebc9-ea10-4e9c-be23-96608817ed84/mysql-bootstrap/0.log"
Mar 10 15:03:03 crc kubenswrapper[4911]: I0310 15:03:03.460103 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5ff8ebc9-ea10-4e9c-be23-96608817ed84/galera/0.log"
Mar 10 15:03:03 crc kubenswrapper[4911]: I0310 15:03:03.604545 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_350c17be-173a-480f-bb79-314043291d4d/openstackclient/0.log"
Mar 10 15:03:03 crc kubenswrapper[4911]: I0310 15:03:03.688093 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_607959a6-b845-45ea-b09a-966237b6dd1a/nova-metadata-metadata/0.log"
Mar 10 15:03:03 crc kubenswrapper[4911]: I0310 15:03:03.715193 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9vssn_e43fdd12-0361-428c-8318-d1cec1c95399/ovn-controller/0.log"
Mar 10 15:03:04 crc kubenswrapper[4911]: I0310 15:03:04.003194 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6m8bl_1c9db65b-8d56-4b07-86cd-dc73f3aa87fe/openstack-network-exporter/0.log"
Mar 10 15:03:04 crc kubenswrapper[4911]: I0310 15:03:04.042531 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5hbsq_6941e0ca-8689-452e-82e4-d233cbbd45ec/ovsdb-server-init/0.log"
Mar 10 15:03:04 crc kubenswrapper[4911]: I0310 15:03:04.261608 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5hbsq_6941e0ca-8689-452e-82e4-d233cbbd45ec/ovsdb-server/0.log"
Mar 10 15:03:04 crc kubenswrapper[4911]: I0310 15:03:04.278681 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5hbsq_6941e0ca-8689-452e-82e4-d233cbbd45ec/ovsdb-server-init/0.log"
Mar 10 15:03:04 crc kubenswrapper[4911]: I0310 15:03:04.297525 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5hbsq_6941e0ca-8689-452e-82e4-d233cbbd45ec/ovs-vswitchd/0.log"
Mar 10 15:03:04 crc kubenswrapper[4911]: I0310 15:03:04.525361 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-prhkd_3d4d304b-5bae-475d-9d99-da422d354bb0/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:03:04 crc kubenswrapper[4911]: I0310 15:03:04.676271 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_43711b1d-4425-4081-ad98-ecee8b8c73c7/ovn-northd/0.log"
Mar 10 15:03:04 crc kubenswrapper[4911]: I0310 15:03:04.694829 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_43711b1d-4425-4081-ad98-ecee8b8c73c7/openstack-network-exporter/0.log"
Mar 10 15:03:04 crc kubenswrapper[4911]: I0310 15:03:04.852198 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3a8355c9-0644-458c-9df7-bbbfd01fc249/openstack-network-exporter/0.log"
Mar 10 15:03:04 crc kubenswrapper[4911]: I0310 15:03:04.946937 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3a8355c9-0644-458c-9df7-bbbfd01fc249/ovsdbserver-nb/0.log"
Mar 10 15:03:05 crc kubenswrapper[4911]: I0310 15:03:05.090093 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_48ab1a9d-fcce-4cdf-8e73-cae7562b4269/openstack-network-exporter/0.log"
Mar 10 15:03:05 crc kubenswrapper[4911]: I0310 15:03:05.095054 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_48ab1a9d-fcce-4cdf-8e73-cae7562b4269/ovsdbserver-sb/0.log"
Mar 10 15:03:05 crc kubenswrapper[4911]: I0310 15:03:05.326384 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-577d67f998-s8wh9_6f385703-b741-42ec-a63e-ec5a371859de/placement-api/0.log"
Mar 10 15:03:05 crc kubenswrapper[4911]: I0310 15:03:05.469949 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-577d67f998-s8wh9_6f385703-b741-42ec-a63e-ec5a371859de/placement-log/0.log"
Mar 10 15:03:05 crc kubenswrapper[4911]: I0310 15:03:05.513021 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0480ed86-7666-490a-9cd0-78a5ba05dac7/setup-container/0.log"
Mar 10 15:03:05 crc kubenswrapper[4911]: I0310 15:03:05.669447 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0480ed86-7666-490a-9cd0-78a5ba05dac7/setup-container/0.log"
Mar 10 15:03:05 crc kubenswrapper[4911]: I0310 15:03:05.712891 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0480ed86-7666-490a-9cd0-78a5ba05dac7/rabbitmq/0.log"
Mar 10 15:03:05 crc kubenswrapper[4911]: I0310 15:03:05.783644 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_85cb7ff2-e47f-46ad-a30d-6442c0fde95f/setup-container/0.log"
Mar 10 15:03:06 crc kubenswrapper[4911]: I0310 15:03:06.073056 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_85cb7ff2-e47f-46ad-a30d-6442c0fde95f/setup-container/0.log"
Mar 10 15:03:06 crc kubenswrapper[4911]: I0310 15:03:06.101019 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t_7b847208-7241-442f-8b60-b153986d1ea3/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:03:06 crc kubenswrapper[4911]: I0310 15:03:06.124143 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_85cb7ff2-e47f-46ad-a30d-6442c0fde95f/rabbitmq/0.log"
Mar 10 15:03:06 crc kubenswrapper[4911]: I0310 15:03:06.345180 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-rc5zm_05cc5850-302b-49b9-a8d3-62654314670a/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:03:06 crc kubenswrapper[4911]: I0310 15:03:06.478662 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8_3f912ff3-8e6b-4757-8708-865cb96e132e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:03:06 crc kubenswrapper[4911]: I0310 15:03:06.698080 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-wcn6n_8c81ff0d-aedd-419d-b159-b2e36b895839/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:03:06 crc kubenswrapper[4911]: I0310 15:03:06.751219 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-qhk66_02e6e27d-b387-4fa4-993a-525b581993c1/ssh-known-hosts-edpm-deployment/0.log"
Mar 10 15:03:06 crc kubenswrapper[4911]: I0310 15:03:06.975800 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-69f9f96d6c-plmfc_7f8852b3-f34b-4a37-b546-b7bd6b595203/proxy-server/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.015996 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-69f9f96d6c-plmfc_7f8852b3-f34b-4a37-b546-b7bd6b595203/proxy-httpd/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.164168 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hkqgc_4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a/swift-ring-rebalance/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.276674 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/account-auditor/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.390235 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/account-reaper/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.450483 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/account-replicator/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.549092 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/account-server/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.555448 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/container-auditor/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.663430 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/container-replicator/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.761664 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/container-server/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.806133 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/container-updater/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.859591 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/object-auditor/0.log"
Mar 10 15:03:07 crc kubenswrapper[4911]: I0310 15:03:07.924992 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/object-expirer/0.log"
Mar 10 15:03:08 crc kubenswrapper[4911]: I0310 15:03:08.034940 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/object-server/0.log"
Mar 10 15:03:08 crc kubenswrapper[4911]: I0310 15:03:08.046987 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/object-replicator/0.log"
Mar 10 15:03:08 crc kubenswrapper[4911]: I0310 15:03:08.185989 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/object-updater/0.log"
Mar 10 15:03:08 crc kubenswrapper[4911]: I0310 15:03:08.216689 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/rsync/0.log"
Mar 10 15:03:08 crc kubenswrapper[4911]: I0310 15:03:08.303234 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/swift-recon-cron/0.log"
Mar 10 15:03:08 crc kubenswrapper[4911]: I0310 15:03:08.483352 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6_1fe4191c-9c8e-4d7c-9323-0fce2c397878/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:03:08 crc kubenswrapper[4911]: I0310 15:03:08.546968 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c6a78318-420c-43fe-98f3-9306e18ee2d4/tempest-tests-tempest-tests-runner/0.log"
Mar 10 15:03:08 crc kubenswrapper[4911]: I0310 15:03:08.693872 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_736e35e8-db53-456b-a374-50f70159f967/test-operator-logs-container/0.log"
Mar 10 15:03:08 crc kubenswrapper[4911]: I0310 15:03:08.835267 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-nttsn_94cde38e-e826-4cad-9f7a-55e42ec4964a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:03:12 crc kubenswrapper[4911]: I0310 15:03:12.195124 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848"
Mar 10 15:03:12 crc kubenswrapper[4911]: E0310 15:03:12.196016 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 15:03:19 crc kubenswrapper[4911]: I0310 15:03:19.626444 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_45599797-9a4e-428b-8f95-39b6db7bd84e/memcached/0.log"
Mar 10 15:03:27 crc kubenswrapper[4911]: I0310 15:03:27.193824 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848"
Mar 10 15:03:27 crc kubenswrapper[4911]: E0310 15:03:27.197306 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 15:03:35 crc kubenswrapper[4911]: I0310 15:03:35.929834 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/util/0.log"
Mar 10 15:03:36 crc kubenswrapper[4911]: I0310 15:03:36.180330 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/pull/0.log"
Mar 10 15:03:36 crc kubenswrapper[4911]: I0310 15:03:36.214088 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/pull/0.log"
Mar 10 15:03:36 crc kubenswrapper[4911]: I0310 15:03:36.234377 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/util/0.log"
Mar 10 15:03:36 crc kubenswrapper[4911]: I0310 15:03:36.394554 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/util/0.log"
Mar 10 15:03:36 crc kubenswrapper[4911]: I0310 15:03:36.433572 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/extract/0.log"
Mar 10 15:03:36 crc kubenswrapper[4911]: I0310 15:03:36.433839 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/pull/0.log"
Mar 10 15:03:36 crc kubenswrapper[4911]: I0310 15:03:36.927943 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-t2lgw_c8490ff5-eaf3-4d9e-b9f9-7ad3ae159298/manager/0.log"
Mar 10 15:03:37 crc kubenswrapper[4911]: I0310 15:03:37.320153 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-z5pz7_6a0bd4c9-4420-48be-9637-67ea2b5c89d1/manager/0.log"
Mar 10 15:03:37 crc kubenswrapper[4911]: I0310 15:03:37.374136 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-8mqrm_14dd9547-ff92-4cb4-a055-e41fd390e90e/manager/0.log"
Mar 10 15:03:37 crc kubenswrapper[4911]: I0310 15:03:37.635824 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-72kfl_e2b89cc3-8229-4401-b8e9-9a32bffb0f57/manager/0.log"
Mar 10 15:03:38 crc kubenswrapper[4911]: I0310 15:03:38.041035 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-hngxq_a937a94a-14cb-4319-9147-d0ac60c5cc6a/manager/0.log"
Mar 10 15:03:38 crc kubenswrapper[4911]: I0310 15:03:38.389805 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-t2qf9_06a5238b-e7e1-49a5-9bb8-5f6162183a13/manager/0.log"
Mar 10 15:03:38 crc kubenswrapper[4911]: I0310 15:03:38.394716 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-jjsfs_c5336054-5038-40f7-8512-9fe34269f6cd/manager/0.log"
Mar 10 15:03:38 crc kubenswrapper[4911]: I0310 15:03:38.717863 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-xw7bn_998d9bc8-11c1-4967-b3d9-1c823d6c41d6/manager/0.log"
Mar 10 15:03:38 crc kubenswrapper[4911]: I0310 15:03:38.751927 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-8mq9k_90f49412-d2c3-46ba-9591-5adee9624834/manager/0.log"
Mar 10 15:03:38 crc kubenswrapper[4911]: I0310 15:03:38.970420 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-7pcfv_53d2a376-b957-4875-8bfe-42d5dbc0a634/manager/0.log"
Mar 10 15:03:39 crc kubenswrapper[4911]: I0310 15:03:39.274491 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-5v9fq_7c70b1a5-051f-43b5-80a3-1b462b9a50f8/manager/0.log"
Mar 10 15:03:39 crc kubenswrapper[4911]: I0310 15:03:39.420006 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-57mls_c5080d31-7711-4e4a-9902-4843929a16e9/manager/0.log"
Mar 10 15:03:39 crc kubenswrapper[4911]: I0310 15:03:39.562275 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-cqdch_5cf94e3e-325d-4364-bf70-c479683b2be6/manager/0.log"
Mar 10 15:03:39 crc kubenswrapper[4911]: I0310 15:03:39.783355 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h_f82c1a17-4dc8-48c2-9bc2-9d7168524de3/manager/0.log"
Mar 10 15:03:40 crc kubenswrapper[4911]: I0310 15:03:40.155176 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-554774d6c8-bpzx4_f34f11f5-13c7-426d-b30b-127ddd115a17/operator/0.log"
Mar 10 15:03:40 crc kubenswrapper[4911]: I0310 15:03:40.193315 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848"
Mar 10 15:03:40 crc kubenswrapper[4911]: E0310 15:03:40.193769 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 15:03:40 crc kubenswrapper[4911]: I0310 15:03:40.431486 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5f7w8_2ccee19d-0e75-4358-87aa-16359f6bd2ee/registry-server/0.log"
Mar 10 15:03:40 crc kubenswrapper[4911]: I0310 15:03:40.639443 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-2mnh2_a036efe0-e6cc-4ebe-8b06-70bc180b7b1c/manager/0.log"
Mar 10 15:03:40 crc kubenswrapper[4911]: I0310 15:03:40.802434 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-khnvw_a07393e2-b210-4e68-8cd3-62a838f86071/manager/0.log"
Mar 10 15:03:40 crc kubenswrapper[4911]: I0310 15:03:40.976482 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7n8pz_0b94f7c5-35a4-430f-bccb-011f386954d5/operator/0.log"
Mar 10 15:03:41 crc kubenswrapper[4911]: I0310 15:03:41.162200 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-8jk8x_fca0a377-f77c-4e24-aec1-8ffb8ba87963/manager/0.log"
Mar 10 15:03:41 crc kubenswrapper[4911]: I0310 15:03:41.339711 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-pgf4d_c8487a91-d6ca-480d-a451-35e6516bc9e8/manager/0.log"
Mar 10 15:03:41 crc kubenswrapper[4911]: I0310 15:03:41.477165 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-wltgc_902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6/manager/0.log"
Mar 10 15:03:41 crc kubenswrapper[4911]: I0310 15:03:41.662671 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-fmrnr_ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20/manager/0.log"
Mar 10 15:03:42 crc kubenswrapper[4911]: I0310 15:03:42.037006 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-774dfd9959-g5lwx_0fcc8b66-2a29-45c8-a445-a14770e3f157/manager/0.log"
Mar 10 15:03:43 crc kubenswrapper[4911]: I0310 15:03:43.893013 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-wv8p7_0dabe548-2d6c-4bbb-8199-6403e57d2ac9/manager/0.log"
Mar 10 15:03:52 crc kubenswrapper[4911]: I0310 15:03:52.193256 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848"
Mar 10 15:03:52 crc kubenswrapper[4911]: E0310 15:03:52.194108 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\""
pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.153321 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552584-x2k99"] Mar 10 15:04:00 crc kubenswrapper[4911]: E0310 15:04:00.154739 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ef9046-bbb2-49fd-97e5-d67ec3057127" containerName="container-00" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.154759 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ef9046-bbb2-49fd-97e5-d67ec3057127" containerName="container-00" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.155051 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ef9046-bbb2-49fd-97e5-d67ec3057127" containerName="container-00" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.156197 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552584-x2k99" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.161988 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.162391 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.164705 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552584-x2k99"] Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.166863 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.171277 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpq29\" (UniqueName: 
\"kubernetes.io/projected/7ec44bb1-d5c7-422e-9d9f-a1a94d44b259-kube-api-access-tpq29\") pod \"auto-csr-approver-29552584-x2k99\" (UID: \"7ec44bb1-d5c7-422e-9d9f-a1a94d44b259\") " pod="openshift-infra/auto-csr-approver-29552584-x2k99" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.273844 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpq29\" (UniqueName: \"kubernetes.io/projected/7ec44bb1-d5c7-422e-9d9f-a1a94d44b259-kube-api-access-tpq29\") pod \"auto-csr-approver-29552584-x2k99\" (UID: \"7ec44bb1-d5c7-422e-9d9f-a1a94d44b259\") " pod="openshift-infra/auto-csr-approver-29552584-x2k99" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.293685 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpq29\" (UniqueName: \"kubernetes.io/projected/7ec44bb1-d5c7-422e-9d9f-a1a94d44b259-kube-api-access-tpq29\") pod \"auto-csr-approver-29552584-x2k99\" (UID: \"7ec44bb1-d5c7-422e-9d9f-a1a94d44b259\") " pod="openshift-infra/auto-csr-approver-29552584-x2k99" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.482829 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552584-x2k99" Mar 10 15:04:00 crc kubenswrapper[4911]: I0310 15:04:00.984268 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552584-x2k99"] Mar 10 15:04:01 crc kubenswrapper[4911]: I0310 15:04:01.253086 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552584-x2k99" event={"ID":"7ec44bb1-d5c7-422e-9d9f-a1a94d44b259","Type":"ContainerStarted","Data":"d6503428152afe9f794db7a64bae746d288796e8a3acadd074decbfaf4913907"} Mar 10 15:04:01 crc kubenswrapper[4911]: I0310 15:04:01.714629 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4s5dj_61dbbc3f-94f4-4c65-8c0b-7181159fcae3/control-plane-machine-set-operator/0.log" Mar 10 15:04:01 crc kubenswrapper[4911]: I0310 15:04:01.938934 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s5jkn_39208142-b788-4b42-a0f2-421544f8833f/kube-rbac-proxy/0.log" Mar 10 15:04:01 crc kubenswrapper[4911]: I0310 15:04:01.983710 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s5jkn_39208142-b788-4b42-a0f2-421544f8833f/machine-api-operator/0.log" Mar 10 15:04:03 crc kubenswrapper[4911]: I0310 15:04:03.273905 4911 generic.go:334] "Generic (PLEG): container finished" podID="7ec44bb1-d5c7-422e-9d9f-a1a94d44b259" containerID="d70a29bd4a46f8367eaa919ac5de6960ba848defe1ca97faa72de1bc36ae127d" exitCode=0 Mar 10 15:04:03 crc kubenswrapper[4911]: I0310 15:04:03.274003 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552584-x2k99" event={"ID":"7ec44bb1-d5c7-422e-9d9f-a1a94d44b259","Type":"ContainerDied","Data":"d70a29bd4a46f8367eaa919ac5de6960ba848defe1ca97faa72de1bc36ae127d"} Mar 10 15:04:04 crc kubenswrapper[4911]: I0310 15:04:04.673175 
4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552584-x2k99" Mar 10 15:04:04 crc kubenswrapper[4911]: I0310 15:04:04.875157 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpq29\" (UniqueName: \"kubernetes.io/projected/7ec44bb1-d5c7-422e-9d9f-a1a94d44b259-kube-api-access-tpq29\") pod \"7ec44bb1-d5c7-422e-9d9f-a1a94d44b259\" (UID: \"7ec44bb1-d5c7-422e-9d9f-a1a94d44b259\") " Mar 10 15:04:04 crc kubenswrapper[4911]: I0310 15:04:04.884788 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec44bb1-d5c7-422e-9d9f-a1a94d44b259-kube-api-access-tpq29" (OuterVolumeSpecName: "kube-api-access-tpq29") pod "7ec44bb1-d5c7-422e-9d9f-a1a94d44b259" (UID: "7ec44bb1-d5c7-422e-9d9f-a1a94d44b259"). InnerVolumeSpecName "kube-api-access-tpq29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:04:04 crc kubenswrapper[4911]: I0310 15:04:04.979933 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpq29\" (UniqueName: \"kubernetes.io/projected/7ec44bb1-d5c7-422e-9d9f-a1a94d44b259-kube-api-access-tpq29\") on node \"crc\" DevicePath \"\"" Mar 10 15:04:05 crc kubenswrapper[4911]: I0310 15:04:05.298945 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552584-x2k99" event={"ID":"7ec44bb1-d5c7-422e-9d9f-a1a94d44b259","Type":"ContainerDied","Data":"d6503428152afe9f794db7a64bae746d288796e8a3acadd074decbfaf4913907"} Mar 10 15:04:05 crc kubenswrapper[4911]: I0310 15:04:05.299002 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6503428152afe9f794db7a64bae746d288796e8a3acadd074decbfaf4913907" Mar 10 15:04:05 crc kubenswrapper[4911]: I0310 15:04:05.299074 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552584-x2k99" Mar 10 15:04:05 crc kubenswrapper[4911]: I0310 15:04:05.764351 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552578-j8hft"] Mar 10 15:04:05 crc kubenswrapper[4911]: I0310 15:04:05.777279 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552578-j8hft"] Mar 10 15:04:06 crc kubenswrapper[4911]: I0310 15:04:06.216630 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d142a4-12ea-429f-9755-388522ba6861" path="/var/lib/kubelet/pods/31d142a4-12ea-429f-9755-388522ba6861/volumes" Mar 10 15:04:07 crc kubenswrapper[4911]: I0310 15:04:07.193678 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:04:07 crc kubenswrapper[4911]: E0310 15:04:07.194308 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:04:15 crc kubenswrapper[4911]: I0310 15:04:15.527821 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-t952c_b05a33fd-fd6b-4b1b-ad0f-427586c8e81a/cert-manager-controller/0.log" Mar 10 15:04:15 crc kubenswrapper[4911]: I0310 15:04:15.704142 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-5bw5t_8d1ebd76-111d-461e-8031-13d1071d1e64/cert-manager-cainjector/0.log" Mar 10 15:04:15 crc kubenswrapper[4911]: I0310 15:04:15.750521 4911 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-2fm2x_90feb50a-5cbc-4a77-b328-65b1f3adefc0/cert-manager-webhook/0.log" Mar 10 15:04:18 crc kubenswrapper[4911]: I0310 15:04:18.193874 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:04:18 crc kubenswrapper[4911]: E0310 15:04:18.194447 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:04:28 crc kubenswrapper[4911]: I0310 15:04:28.880650 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-dw2wk_c9cf4f47-5446-49ed-95d1-b3e9322ce43e/nmstate-console-plugin/0.log" Mar 10 15:04:29 crc kubenswrapper[4911]: I0310 15:04:29.069490 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-v89h8_85141c5d-b88f-4970-a62f-e826726facc1/nmstate-handler/0.log" Mar 10 15:04:29 crc kubenswrapper[4911]: I0310 15:04:29.136469 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-hj6l2_892d43d8-c12a-47b3-8056-d0d6024e961e/kube-rbac-proxy/0.log" Mar 10 15:04:29 crc kubenswrapper[4911]: I0310 15:04:29.194515 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:04:29 crc kubenswrapper[4911]: E0310 15:04:29.194829 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:04:29 crc kubenswrapper[4911]: I0310 15:04:29.219709 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-hj6l2_892d43d8-c12a-47b3-8056-d0d6024e961e/nmstate-metrics/0.log" Mar 10 15:04:29 crc kubenswrapper[4911]: I0310 15:04:29.337540 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-rdddd_fdc52c27-9268-47bd-b07e-8d9995db81bb/nmstate-operator/0.log" Mar 10 15:04:29 crc kubenswrapper[4911]: I0310 15:04:29.419024 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-rr2pg_9d21e056-f31e-4a31-a5fe-1543fd7dbc98/nmstate-webhook/0.log" Mar 10 15:04:44 crc kubenswrapper[4911]: I0310 15:04:44.194566 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:04:44 crc kubenswrapper[4911]: E0310 15:04:44.195644 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:04:55 crc kubenswrapper[4911]: I0310 15:04:55.194394 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:04:55 crc kubenswrapper[4911]: E0310 15:04:55.195190 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:04:57 crc kubenswrapper[4911]: I0310 15:04:57.573614 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-78clj_1220afb2-8f3a-4b0c-8b88-9e690005eaf2/kube-rbac-proxy/0.log" Mar 10 15:04:57 crc kubenswrapper[4911]: I0310 15:04:57.758990 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-78clj_1220afb2-8f3a-4b0c-8b88-9e690005eaf2/controller/0.log" Mar 10 15:04:57 crc kubenswrapper[4911]: I0310 15:04:57.798598 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-frr-files/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.047561 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-frr-files/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.048132 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-metrics/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.074199 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-reloader/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.112479 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-reloader/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.269345 4911 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-frr-files/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.310734 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-metrics/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.313902 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-metrics/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.331857 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-reloader/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.545371 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-frr-files/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.577285 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-metrics/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.581521 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/controller/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.601877 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-reloader/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.775381 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/frr-metrics/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.790503 4911 scope.go:117] "RemoveContainer" containerID="a2bbf2cc93217f0adeedde369d3dcb68c1b646d5a6d3b9e763f261262edc67ea" Mar 10 15:04:58 crc 
kubenswrapper[4911]: I0310 15:04:58.821482 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/kube-rbac-proxy-frr/0.log" Mar 10 15:04:58 crc kubenswrapper[4911]: I0310 15:04:58.846934 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/kube-rbac-proxy/0.log" Mar 10 15:04:59 crc kubenswrapper[4911]: I0310 15:04:59.016513 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/reloader/0.log" Mar 10 15:04:59 crc kubenswrapper[4911]: I0310 15:04:59.069330 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-npzpx_26684eb9-05cb-450a-b9b3-225b34518a92/frr-k8s-webhook-server/0.log" Mar 10 15:04:59 crc kubenswrapper[4911]: I0310 15:04:59.290428 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d5b96ddc6-xbpp4_2ab11607-0b3a-4d76-bb7d-34fd3c9fa271/manager/0.log" Mar 10 15:04:59 crc kubenswrapper[4911]: I0310 15:04:59.501617 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6555f87c79-t9n77_bca856e8-824f-41ea-999f-353b10773511/webhook-server/0.log" Mar 10 15:04:59 crc kubenswrapper[4911]: I0310 15:04:59.656945 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w9cpd_884e47b6-07a7-4d77-b73a-ffa7a9a59807/kube-rbac-proxy/0.log" Mar 10 15:05:00 crc kubenswrapper[4911]: I0310 15:05:00.265128 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w9cpd_884e47b6-07a7-4d77-b73a-ffa7a9a59807/speaker/0.log" Mar 10 15:05:00 crc kubenswrapper[4911]: I0310 15:05:00.531287 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/frr/0.log" 
Mar 10 15:05:07 crc kubenswrapper[4911]: I0310 15:05:07.193091 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:05:07 crc kubenswrapper[4911]: E0310 15:05:07.193781 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:05:14 crc kubenswrapper[4911]: I0310 15:05:14.463467 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/util/0.log" Mar 10 15:05:14 crc kubenswrapper[4911]: I0310 15:05:14.521947 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/util/0.log" Mar 10 15:05:14 crc kubenswrapper[4911]: I0310 15:05:14.657736 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/pull/0.log" Mar 10 15:05:14 crc kubenswrapper[4911]: I0310 15:05:14.709968 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/pull/0.log" Mar 10 15:05:14 crc kubenswrapper[4911]: I0310 15:05:14.887631 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/extract/0.log" Mar 10 
15:05:14 crc kubenswrapper[4911]: I0310 15:05:14.906070 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/pull/0.log" Mar 10 15:05:14 crc kubenswrapper[4911]: I0310 15:05:14.949585 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/util/0.log" Mar 10 15:05:15 crc kubenswrapper[4911]: I0310 15:05:15.097427 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-utilities/0.log" Mar 10 15:05:15 crc kubenswrapper[4911]: I0310 15:05:15.277333 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-content/0.log" Mar 10 15:05:15 crc kubenswrapper[4911]: I0310 15:05:15.294372 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-content/0.log" Mar 10 15:05:15 crc kubenswrapper[4911]: I0310 15:05:15.301934 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-utilities/0.log" Mar 10 15:05:15 crc kubenswrapper[4911]: I0310 15:05:15.441058 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-utilities/0.log" Mar 10 15:05:15 crc kubenswrapper[4911]: I0310 15:05:15.496411 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-content/0.log" Mar 10 15:05:15 crc kubenswrapper[4911]: I0310 15:05:15.725078 
4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-utilities/0.log" Mar 10 15:05:15 crc kubenswrapper[4911]: I0310 15:05:15.969541 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-utilities/0.log" Mar 10 15:05:16 crc kubenswrapper[4911]: I0310 15:05:16.047990 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-content/0.log" Mar 10 15:05:16 crc kubenswrapper[4911]: I0310 15:05:16.052134 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-content/0.log" Mar 10 15:05:16 crc kubenswrapper[4911]: I0310 15:05:16.103669 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/registry-server/0.log" Mar 10 15:05:16 crc kubenswrapper[4911]: I0310 15:05:16.266247 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-utilities/0.log" Mar 10 15:05:16 crc kubenswrapper[4911]: I0310 15:05:16.272397 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-content/0.log" Mar 10 15:05:16 crc kubenswrapper[4911]: I0310 15:05:16.509694 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/util/0.log" Mar 10 15:05:16 crc kubenswrapper[4911]: I0310 15:05:16.727578 4911 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/util/0.log" Mar 10 15:05:16 crc kubenswrapper[4911]: I0310 15:05:16.775395 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/pull/0.log" Mar 10 15:05:16 crc kubenswrapper[4911]: I0310 15:05:16.876111 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/registry-server/0.log" Mar 10 15:05:16 crc kubenswrapper[4911]: I0310 15:05:16.883518 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/pull/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.007701 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/pull/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.022623 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/util/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.076286 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/extract/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.221473 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tvhkw_68abbbcf-c1ce-4be8-9252-9cd985160953/marketplace-operator/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: 
I0310 15:05:17.308935 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-utilities/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.479017 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-content/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.484407 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-content/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.493114 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-utilities/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.659496 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-utilities/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.707216 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-content/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.840854 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/registry-server/0.log" Mar 10 15:05:17 crc kubenswrapper[4911]: I0310 15:05:17.957321 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-utilities/0.log" Mar 10 15:05:18 crc kubenswrapper[4911]: I0310 15:05:18.116821 4911 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-utilities/0.log" Mar 10 15:05:18 crc kubenswrapper[4911]: I0310 15:05:18.116842 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-content/0.log" Mar 10 15:05:18 crc kubenswrapper[4911]: I0310 15:05:18.129697 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-content/0.log" Mar 10 15:05:18 crc kubenswrapper[4911]: I0310 15:05:18.298540 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-utilities/0.log" Mar 10 15:05:18 crc kubenswrapper[4911]: I0310 15:05:18.325122 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-content/0.log" Mar 10 15:05:18 crc kubenswrapper[4911]: I0310 15:05:18.802420 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/registry-server/0.log" Mar 10 15:05:22 crc kubenswrapper[4911]: I0310 15:05:22.193597 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:05:22 crc kubenswrapper[4911]: E0310 15:05:22.193955 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:05:37 crc 
kubenswrapper[4911]: I0310 15:05:37.193835 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:05:37 crc kubenswrapper[4911]: E0310 15:05:37.194711 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:05:50 crc kubenswrapper[4911]: E0310 15:05:50.336530 4911 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.153:41928->38.102.83.153:36859: write tcp 38.102.83.153:41928->38.102.83.153:36859: write: broken pipe Mar 10 15:05:51 crc kubenswrapper[4911]: I0310 15:05:51.193798 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:05:51 crc kubenswrapper[4911]: E0310 15:05:51.194139 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.156561 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552586-8fv8m"] Mar 10 15:06:00 crc kubenswrapper[4911]: E0310 15:06:00.157826 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec44bb1-d5c7-422e-9d9f-a1a94d44b259" containerName="oc" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 
15:06:00.157844 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec44bb1-d5c7-422e-9d9f-a1a94d44b259" containerName="oc" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.158041 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec44bb1-d5c7-422e-9d9f-a1a94d44b259" containerName="oc" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.159194 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552586-8fv8m" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.166358 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.166753 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.166942 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.184347 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552586-8fv8m"] Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.203126 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x49s2\" (UniqueName: \"kubernetes.io/projected/d11552a5-7768-424e-a30b-49e247d09194-kube-api-access-x49s2\") pod \"auto-csr-approver-29552586-8fv8m\" (UID: \"d11552a5-7768-424e-a30b-49e247d09194\") " pod="openshift-infra/auto-csr-approver-29552586-8fv8m" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.305223 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x49s2\" (UniqueName: \"kubernetes.io/projected/d11552a5-7768-424e-a30b-49e247d09194-kube-api-access-x49s2\") pod \"auto-csr-approver-29552586-8fv8m\" (UID: 
\"d11552a5-7768-424e-a30b-49e247d09194\") " pod="openshift-infra/auto-csr-approver-29552586-8fv8m" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.329372 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x49s2\" (UniqueName: \"kubernetes.io/projected/d11552a5-7768-424e-a30b-49e247d09194-kube-api-access-x49s2\") pod \"auto-csr-approver-29552586-8fv8m\" (UID: \"d11552a5-7768-424e-a30b-49e247d09194\") " pod="openshift-infra/auto-csr-approver-29552586-8fv8m" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.490213 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552586-8fv8m" Mar 10 15:06:00 crc kubenswrapper[4911]: I0310 15:06:00.962477 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552586-8fv8m"] Mar 10 15:06:01 crc kubenswrapper[4911]: I0310 15:06:01.580514 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552586-8fv8m" event={"ID":"d11552a5-7768-424e-a30b-49e247d09194","Type":"ContainerStarted","Data":"197aadc63000ccc403cf34949793ced231fdb51a5985273c82106a23567f19cf"} Mar 10 15:06:02 crc kubenswrapper[4911]: I0310 15:06:02.592081 4911 generic.go:334] "Generic (PLEG): container finished" podID="d11552a5-7768-424e-a30b-49e247d09194" containerID="05c64525355e2e35fe082f05c60ccc834ba8560fb4e2479223ba37dae76e9c0d" exitCode=0 Mar 10 15:06:02 crc kubenswrapper[4911]: I0310 15:06:02.592177 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552586-8fv8m" event={"ID":"d11552a5-7768-424e-a30b-49e247d09194","Type":"ContainerDied","Data":"05c64525355e2e35fe082f05c60ccc834ba8560fb4e2479223ba37dae76e9c0d"} Mar 10 15:06:03 crc kubenswrapper[4911]: I0310 15:06:03.194396 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:06:03 crc kubenswrapper[4911]: 
E0310 15:06:03.194615 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:06:03 crc kubenswrapper[4911]: I0310 15:06:03.958970 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552586-8fv8m" Mar 10 15:06:03 crc kubenswrapper[4911]: I0310 15:06:03.992448 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x49s2\" (UniqueName: \"kubernetes.io/projected/d11552a5-7768-424e-a30b-49e247d09194-kube-api-access-x49s2\") pod \"d11552a5-7768-424e-a30b-49e247d09194\" (UID: \"d11552a5-7768-424e-a30b-49e247d09194\") " Mar 10 15:06:04 crc kubenswrapper[4911]: I0310 15:06:04.006990 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11552a5-7768-424e-a30b-49e247d09194-kube-api-access-x49s2" (OuterVolumeSpecName: "kube-api-access-x49s2") pod "d11552a5-7768-424e-a30b-49e247d09194" (UID: "d11552a5-7768-424e-a30b-49e247d09194"). InnerVolumeSpecName "kube-api-access-x49s2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:04 crc kubenswrapper[4911]: I0310 15:06:04.095971 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x49s2\" (UniqueName: \"kubernetes.io/projected/d11552a5-7768-424e-a30b-49e247d09194-kube-api-access-x49s2\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:04 crc kubenswrapper[4911]: I0310 15:06:04.628886 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552586-8fv8m" event={"ID":"d11552a5-7768-424e-a30b-49e247d09194","Type":"ContainerDied","Data":"197aadc63000ccc403cf34949793ced231fdb51a5985273c82106a23567f19cf"} Mar 10 15:06:04 crc kubenswrapper[4911]: I0310 15:06:04.629274 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="197aadc63000ccc403cf34949793ced231fdb51a5985273c82106a23567f19cf" Mar 10 15:06:04 crc kubenswrapper[4911]: I0310 15:06:04.629005 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552586-8fv8m" Mar 10 15:06:05 crc kubenswrapper[4911]: I0310 15:06:05.045266 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552580-h4647"] Mar 10 15:06:05 crc kubenswrapper[4911]: I0310 15:06:05.057200 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552580-h4647"] Mar 10 15:06:06 crc kubenswrapper[4911]: I0310 15:06:06.211672 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="766a5fea-e025-4ad7-94aa-037143ddf8a3" path="/var/lib/kubelet/pods/766a5fea-e025-4ad7-94aa-037143ddf8a3/volumes" Mar 10 15:06:18 crc kubenswrapper[4911]: I0310 15:06:18.193188 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:06:18 crc kubenswrapper[4911]: E0310 15:06:18.194098 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:06:31 crc kubenswrapper[4911]: I0310 15:06:31.194554 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:06:31 crc kubenswrapper[4911]: E0310 15:06:31.195340 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:06:46 crc kubenswrapper[4911]: I0310 15:06:46.207756 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:06:46 crc kubenswrapper[4911]: E0310 15:06:46.208849 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:06:58 crc kubenswrapper[4911]: I0310 15:06:58.193744 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:06:58 crc kubenswrapper[4911]: E0310 15:06:58.194976 4911 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:06:58 crc kubenswrapper[4911]: I0310 15:06:58.919671 4911 scope.go:117] "RemoveContainer" containerID="ce855d5066d81124ceb3d1e96738cc58ceec7cc7a3f348d260388fd082163256" Mar 10 15:07:13 crc kubenswrapper[4911]: I0310 15:07:13.194369 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:07:13 crc kubenswrapper[4911]: E0310 15:07:13.195148 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:07:14 crc kubenswrapper[4911]: I0310 15:07:14.716731 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z9b4h"] Mar 10 15:07:14 crc kubenswrapper[4911]: E0310 15:07:14.718967 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11552a5-7768-424e-a30b-49e247d09194" containerName="oc" Mar 10 15:07:14 crc kubenswrapper[4911]: I0310 15:07:14.718996 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11552a5-7768-424e-a30b-49e247d09194" containerName="oc" Mar 10 15:07:14 crc kubenswrapper[4911]: I0310 15:07:14.719459 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11552a5-7768-424e-a30b-49e247d09194" containerName="oc" Mar 10 15:07:14 crc 
kubenswrapper[4911]: I0310 15:07:14.726099 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:14 crc kubenswrapper[4911]: I0310 15:07:14.736394 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z9b4h"] Mar 10 15:07:14 crc kubenswrapper[4911]: I0310 15:07:14.918970 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-utilities\") pod \"redhat-operators-z9b4h\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:14 crc kubenswrapper[4911]: I0310 15:07:14.919299 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk5fn\" (UniqueName: \"kubernetes.io/projected/567f7524-d796-4fd8-b37c-819275477e25-kube-api-access-wk5fn\") pod \"redhat-operators-z9b4h\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:14 crc kubenswrapper[4911]: I0310 15:07:14.919395 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-catalog-content\") pod \"redhat-operators-z9b4h\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:15 crc kubenswrapper[4911]: I0310 15:07:15.022181 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-catalog-content\") pod \"redhat-operators-z9b4h\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:15 crc 
kubenswrapper[4911]: I0310 15:07:15.022269 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-utilities\") pod \"redhat-operators-z9b4h\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:15 crc kubenswrapper[4911]: I0310 15:07:15.022333 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk5fn\" (UniqueName: \"kubernetes.io/projected/567f7524-d796-4fd8-b37c-819275477e25-kube-api-access-wk5fn\") pod \"redhat-operators-z9b4h\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:15 crc kubenswrapper[4911]: I0310 15:07:15.022779 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-catalog-content\") pod \"redhat-operators-z9b4h\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:15 crc kubenswrapper[4911]: I0310 15:07:15.022896 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-utilities\") pod \"redhat-operators-z9b4h\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:15 crc kubenswrapper[4911]: I0310 15:07:15.049595 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk5fn\" (UniqueName: \"kubernetes.io/projected/567f7524-d796-4fd8-b37c-819275477e25-kube-api-access-wk5fn\") pod \"redhat-operators-z9b4h\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:15 crc kubenswrapper[4911]: I0310 15:07:15.057468 4911 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:15 crc kubenswrapper[4911]: I0310 15:07:15.701879 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z9b4h"] Mar 10 15:07:16 crc kubenswrapper[4911]: I0310 15:07:16.463390 4911 generic.go:334] "Generic (PLEG): container finished" podID="567f7524-d796-4fd8-b37c-819275477e25" containerID="fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120" exitCode=0 Mar 10 15:07:16 crc kubenswrapper[4911]: I0310 15:07:16.463450 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9b4h" event={"ID":"567f7524-d796-4fd8-b37c-819275477e25","Type":"ContainerDied","Data":"fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120"} Mar 10 15:07:16 crc kubenswrapper[4911]: I0310 15:07:16.463713 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9b4h" event={"ID":"567f7524-d796-4fd8-b37c-819275477e25","Type":"ContainerStarted","Data":"51f5dd5162945e5f7af0003dbfc3efdcdad4072d0c525a3b745700ed6f25c05e"} Mar 10 15:07:16 crc kubenswrapper[4911]: I0310 15:07:16.466610 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:07:18 crc kubenswrapper[4911]: I0310 15:07:18.486562 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9b4h" event={"ID":"567f7524-d796-4fd8-b37c-819275477e25","Type":"ContainerStarted","Data":"fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb"} Mar 10 15:07:19 crc kubenswrapper[4911]: I0310 15:07:19.510362 4911 generic.go:334] "Generic (PLEG): container finished" podID="567f7524-d796-4fd8-b37c-819275477e25" containerID="fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb" exitCode=0 Mar 10 15:07:19 crc kubenswrapper[4911]: I0310 15:07:19.510455 4911 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9b4h" event={"ID":"567f7524-d796-4fd8-b37c-819275477e25","Type":"ContainerDied","Data":"fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb"} Mar 10 15:07:20 crc kubenswrapper[4911]: I0310 15:07:20.523603 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9b4h" event={"ID":"567f7524-d796-4fd8-b37c-819275477e25","Type":"ContainerStarted","Data":"ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636"} Mar 10 15:07:20 crc kubenswrapper[4911]: I0310 15:07:20.548790 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z9b4h" podStartSLOduration=3.083301684 podStartE2EDuration="6.54876738s" podCreationTimestamp="2026-03-10 15:07:14 +0000 UTC" firstStartedPulling="2026-03-10 15:07:16.466247214 +0000 UTC m=+3941.029767141" lastFinishedPulling="2026-03-10 15:07:19.93171292 +0000 UTC m=+3944.495232837" observedRunningTime="2026-03-10 15:07:20.543936875 +0000 UTC m=+3945.107456802" watchObservedRunningTime="2026-03-10 15:07:20.54876738 +0000 UTC m=+3945.112287307" Mar 10 15:07:21 crc kubenswrapper[4911]: I0310 15:07:21.536560 4911 generic.go:334] "Generic (PLEG): container finished" podID="6ad1bc14-a3de-4320-90e3-7c49c5f3014f" containerID="c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77" exitCode=0 Mar 10 15:07:21 crc kubenswrapper[4911]: I0310 15:07:21.536774 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vz7n/must-gather-zmqcp" event={"ID":"6ad1bc14-a3de-4320-90e3-7c49c5f3014f","Type":"ContainerDied","Data":"c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77"} Mar 10 15:07:21 crc kubenswrapper[4911]: I0310 15:07:21.538401 4911 scope.go:117] "RemoveContainer" containerID="c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77" Mar 10 15:07:22 crc kubenswrapper[4911]: I0310 15:07:22.351420 
4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4vz7n_must-gather-zmqcp_6ad1bc14-a3de-4320-90e3-7c49c5f3014f/gather/0.log" Mar 10 15:07:25 crc kubenswrapper[4911]: I0310 15:07:25.058877 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:25 crc kubenswrapper[4911]: I0310 15:07:25.059239 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:25 crc kubenswrapper[4911]: I0310 15:07:25.193428 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:07:25 crc kubenswrapper[4911]: E0310 15:07:25.193743 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:07:26 crc kubenswrapper[4911]: I0310 15:07:26.105539 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z9b4h" podUID="567f7524-d796-4fd8-b37c-819275477e25" containerName="registry-server" probeResult="failure" output=< Mar 10 15:07:26 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s Mar 10 15:07:26 crc kubenswrapper[4911]: > Mar 10 15:07:30 crc kubenswrapper[4911]: I0310 15:07:30.537162 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4vz7n/must-gather-zmqcp"] Mar 10 15:07:30 crc kubenswrapper[4911]: I0310 15:07:30.538618 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4vz7n/must-gather-zmqcp" 
podUID="6ad1bc14-a3de-4320-90e3-7c49c5f3014f" containerName="copy" containerID="cri-o://8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978" gracePeriod=2 Mar 10 15:07:30 crc kubenswrapper[4911]: I0310 15:07:30.545909 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4vz7n/must-gather-zmqcp"] Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.035259 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4vz7n_must-gather-zmqcp_6ad1bc14-a3de-4320-90e3-7c49c5f3014f/copy/0.log" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.035851 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vz7n/must-gather-zmqcp" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.211106 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trxvb\" (UniqueName: \"kubernetes.io/projected/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-kube-api-access-trxvb\") pod \"6ad1bc14-a3de-4320-90e3-7c49c5f3014f\" (UID: \"6ad1bc14-a3de-4320-90e3-7c49c5f3014f\") " Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.211301 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-must-gather-output\") pod \"6ad1bc14-a3de-4320-90e3-7c49c5f3014f\" (UID: \"6ad1bc14-a3de-4320-90e3-7c49c5f3014f\") " Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.228740 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-kube-api-access-trxvb" (OuterVolumeSpecName: "kube-api-access-trxvb") pod "6ad1bc14-a3de-4320-90e3-7c49c5f3014f" (UID: "6ad1bc14-a3de-4320-90e3-7c49c5f3014f"). InnerVolumeSpecName "kube-api-access-trxvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.315682 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trxvb\" (UniqueName: \"kubernetes.io/projected/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-kube-api-access-trxvb\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.398208 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6ad1bc14-a3de-4320-90e3-7c49c5f3014f" (UID: "6ad1bc14-a3de-4320-90e3-7c49c5f3014f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.418031 4911 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6ad1bc14-a3de-4320-90e3-7c49c5f3014f-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.642206 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4vz7n_must-gather-zmqcp_6ad1bc14-a3de-4320-90e3-7c49c5f3014f/copy/0.log" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.642866 4911 generic.go:334] "Generic (PLEG): container finished" podID="6ad1bc14-a3de-4320-90e3-7c49c5f3014f" containerID="8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978" exitCode=143 Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.642953 4911 scope.go:117] "RemoveContainer" containerID="8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.642972 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vz7n/must-gather-zmqcp" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.677853 4911 scope.go:117] "RemoveContainer" containerID="c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.810444 4911 scope.go:117] "RemoveContainer" containerID="8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978" Mar 10 15:07:31 crc kubenswrapper[4911]: E0310 15:07:31.811458 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978\": container with ID starting with 8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978 not found: ID does not exist" containerID="8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.811518 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978"} err="failed to get container status \"8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978\": rpc error: code = NotFound desc = could not find container \"8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978\": container with ID starting with 8d9bdaebb92a0f19e4b6ac4eafabc39e3412734fd06b8143f5f8df3d54f5f978 not found: ID does not exist" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.811550 4911 scope.go:117] "RemoveContainer" containerID="c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77" Mar 10 15:07:31 crc kubenswrapper[4911]: E0310 15:07:31.812018 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77\": container with ID starting with 
c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77 not found: ID does not exist" containerID="c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77" Mar 10 15:07:31 crc kubenswrapper[4911]: I0310 15:07:31.812063 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77"} err="failed to get container status \"c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77\": rpc error: code = NotFound desc = could not find container \"c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77\": container with ID starting with c50b5b11320830b035be5af0b1d5028df38b1d2ce1edd5d6a421a748314bec77 not found: ID does not exist" Mar 10 15:07:32 crc kubenswrapper[4911]: I0310 15:07:32.205062 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ad1bc14-a3de-4320-90e3-7c49c5f3014f" path="/var/lib/kubelet/pods/6ad1bc14-a3de-4320-90e3-7c49c5f3014f/volumes" Mar 10 15:07:35 crc kubenswrapper[4911]: I0310 15:07:35.112932 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:35 crc kubenswrapper[4911]: I0310 15:07:35.172210 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:35 crc kubenswrapper[4911]: I0310 15:07:35.354098 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z9b4h"] Mar 10 15:07:36 crc kubenswrapper[4911]: I0310 15:07:36.693892 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z9b4h" podUID="567f7524-d796-4fd8-b37c-819275477e25" containerName="registry-server" containerID="cri-o://ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636" gracePeriod=2 Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.191075 4911 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.388550 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-catalog-content\") pod \"567f7524-d796-4fd8-b37c-819275477e25\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.388718 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk5fn\" (UniqueName: \"kubernetes.io/projected/567f7524-d796-4fd8-b37c-819275477e25-kube-api-access-wk5fn\") pod \"567f7524-d796-4fd8-b37c-819275477e25\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.388988 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-utilities\") pod \"567f7524-d796-4fd8-b37c-819275477e25\" (UID: \"567f7524-d796-4fd8-b37c-819275477e25\") " Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.389986 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-utilities" (OuterVolumeSpecName: "utilities") pod "567f7524-d796-4fd8-b37c-819275477e25" (UID: "567f7524-d796-4fd8-b37c-819275477e25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.401215 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567f7524-d796-4fd8-b37c-819275477e25-kube-api-access-wk5fn" (OuterVolumeSpecName: "kube-api-access-wk5fn") pod "567f7524-d796-4fd8-b37c-819275477e25" (UID: "567f7524-d796-4fd8-b37c-819275477e25"). 
InnerVolumeSpecName "kube-api-access-wk5fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.491863 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.491904 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk5fn\" (UniqueName: \"kubernetes.io/projected/567f7524-d796-4fd8-b37c-819275477e25-kube-api-access-wk5fn\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.545239 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "567f7524-d796-4fd8-b37c-819275477e25" (UID: "567f7524-d796-4fd8-b37c-819275477e25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.593622 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567f7524-d796-4fd8-b37c-819275477e25-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.707055 4911 generic.go:334] "Generic (PLEG): container finished" podID="567f7524-d796-4fd8-b37c-819275477e25" containerID="ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636" exitCode=0 Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.707160 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z9b4h" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.707172 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9b4h" event={"ID":"567f7524-d796-4fd8-b37c-819275477e25","Type":"ContainerDied","Data":"ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636"} Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.708045 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9b4h" event={"ID":"567f7524-d796-4fd8-b37c-819275477e25","Type":"ContainerDied","Data":"51f5dd5162945e5f7af0003dbfc3efdcdad4072d0c525a3b745700ed6f25c05e"} Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.708074 4911 scope.go:117] "RemoveContainer" containerID="ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.730020 4911 scope.go:117] "RemoveContainer" containerID="fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.760820 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z9b4h"] Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.773220 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z9b4h"] Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.790609 4911 scope.go:117] "RemoveContainer" containerID="fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.815303 4911 scope.go:117] "RemoveContainer" containerID="ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636" Mar 10 15:07:37 crc kubenswrapper[4911]: E0310 15:07:37.815830 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636\": container with ID starting with ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636 not found: ID does not exist" containerID="ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.815865 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636"} err="failed to get container status \"ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636\": rpc error: code = NotFound desc = could not find container \"ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636\": container with ID starting with ec00ad5cba478d9457f0f3b9a5ea5122ca871987f7a5104ab4723b72d033e636 not found: ID does not exist" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.815901 4911 scope.go:117] "RemoveContainer" containerID="fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb" Mar 10 15:07:37 crc kubenswrapper[4911]: E0310 15:07:37.816437 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb\": container with ID starting with fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb not found: ID does not exist" containerID="fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.816511 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb"} err="failed to get container status \"fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb\": rpc error: code = NotFound desc = could not find container \"fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb\": container with ID 
starting with fb731a79cd0fc894c7451af386e5024071cba2c50369e9851c5e5caeaee782cb not found: ID does not exist" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.816550 4911 scope.go:117] "RemoveContainer" containerID="fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120" Mar 10 15:07:37 crc kubenswrapper[4911]: E0310 15:07:37.816896 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120\": container with ID starting with fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120 not found: ID does not exist" containerID="fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120" Mar 10 15:07:37 crc kubenswrapper[4911]: I0310 15:07:37.816926 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120"} err="failed to get container status \"fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120\": rpc error: code = NotFound desc = could not find container \"fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120\": container with ID starting with fb1bd87c4317d364cd41285c76afa6db7b70dd155cd3380323ce85bd78291120 not found: ID does not exist" Mar 10 15:07:38 crc kubenswrapper[4911]: I0310 15:07:38.205381 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567f7524-d796-4fd8-b37c-819275477e25" path="/var/lib/kubelet/pods/567f7524-d796-4fd8-b37c-819275477e25/volumes" Mar 10 15:07:40 crc kubenswrapper[4911]: I0310 15:07:40.193699 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:07:40 crc kubenswrapper[4911]: E0310 15:07:40.194963 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:07:52 crc kubenswrapper[4911]: I0310 15:07:52.194929 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:07:52 crc kubenswrapper[4911]: I0310 15:07:52.892155 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"07403a5bc42ca34f5ff0c49121e95c611f228b955cde52cc7ca9346c9d02ab86"} Mar 10 15:07:59 crc kubenswrapper[4911]: I0310 15:07:59.015023 4911 scope.go:117] "RemoveContainer" containerID="ca41a574ddec81823410196108087996467f7c6edb130030fed788baea706471" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.148635 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552588-vfx7l"] Mar 10 15:08:00 crc kubenswrapper[4911]: E0310 15:08:00.149597 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567f7524-d796-4fd8-b37c-819275477e25" containerName="registry-server" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.149617 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="567f7524-d796-4fd8-b37c-819275477e25" containerName="registry-server" Mar 10 15:08:00 crc kubenswrapper[4911]: E0310 15:08:00.149644 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567f7524-d796-4fd8-b37c-819275477e25" containerName="extract-content" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.149651 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="567f7524-d796-4fd8-b37c-819275477e25" containerName="extract-content" Mar 10 15:08:00 crc kubenswrapper[4911]: E0310 15:08:00.149666 4911 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567f7524-d796-4fd8-b37c-819275477e25" containerName="extract-utilities" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.149673 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="567f7524-d796-4fd8-b37c-819275477e25" containerName="extract-utilities" Mar 10 15:08:00 crc kubenswrapper[4911]: E0310 15:08:00.149700 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad1bc14-a3de-4320-90e3-7c49c5f3014f" containerName="gather" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.149707 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad1bc14-a3de-4320-90e3-7c49c5f3014f" containerName="gather" Mar 10 15:08:00 crc kubenswrapper[4911]: E0310 15:08:00.149746 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad1bc14-a3de-4320-90e3-7c49c5f3014f" containerName="copy" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.149777 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad1bc14-a3de-4320-90e3-7c49c5f3014f" containerName="copy" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.150043 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="567f7524-d796-4fd8-b37c-819275477e25" containerName="registry-server" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.150078 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad1bc14-a3de-4320-90e3-7c49c5f3014f" containerName="gather" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.150107 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad1bc14-a3de-4320-90e3-7c49c5f3014f" containerName="copy" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.151077 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-vfx7l" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.158019 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.158324 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.158474 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.159774 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-vfx7l"] Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.210900 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxsqz\" (UniqueName: \"kubernetes.io/projected/59c685a3-6149-4c66-aec6-48ce0436ac47-kube-api-access-gxsqz\") pod \"auto-csr-approver-29552588-vfx7l\" (UID: \"59c685a3-6149-4c66-aec6-48ce0436ac47\") " pod="openshift-infra/auto-csr-approver-29552588-vfx7l" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.312717 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxsqz\" (UniqueName: \"kubernetes.io/projected/59c685a3-6149-4c66-aec6-48ce0436ac47-kube-api-access-gxsqz\") pod \"auto-csr-approver-29552588-vfx7l\" (UID: \"59c685a3-6149-4c66-aec6-48ce0436ac47\") " pod="openshift-infra/auto-csr-approver-29552588-vfx7l" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.333545 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxsqz\" (UniqueName: \"kubernetes.io/projected/59c685a3-6149-4c66-aec6-48ce0436ac47-kube-api-access-gxsqz\") pod \"auto-csr-approver-29552588-vfx7l\" (UID: \"59c685a3-6149-4c66-aec6-48ce0436ac47\") " 
pod="openshift-infra/auto-csr-approver-29552588-vfx7l" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.472825 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-vfx7l" Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.931083 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-vfx7l"] Mar 10 15:08:00 crc kubenswrapper[4911]: I0310 15:08:00.985372 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552588-vfx7l" event={"ID":"59c685a3-6149-4c66-aec6-48ce0436ac47","Type":"ContainerStarted","Data":"6e2de45b5bd544d7bde8ea39c5d14506350558bd7c193149777c5fe80ab35dd7"} Mar 10 15:08:03 crc kubenswrapper[4911]: I0310 15:08:03.006531 4911 generic.go:334] "Generic (PLEG): container finished" podID="59c685a3-6149-4c66-aec6-48ce0436ac47" containerID="d7fb5c1bb7a41a70787011854386956fad8cbf6052fe82695bc11fce6dd62455" exitCode=0 Mar 10 15:08:03 crc kubenswrapper[4911]: I0310 15:08:03.006590 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552588-vfx7l" event={"ID":"59c685a3-6149-4c66-aec6-48ce0436ac47","Type":"ContainerDied","Data":"d7fb5c1bb7a41a70787011854386956fad8cbf6052fe82695bc11fce6dd62455"} Mar 10 15:08:04 crc kubenswrapper[4911]: I0310 15:08:04.408764 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-vfx7l" Mar 10 15:08:04 crc kubenswrapper[4911]: I0310 15:08:04.477759 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxsqz\" (UniqueName: \"kubernetes.io/projected/59c685a3-6149-4c66-aec6-48ce0436ac47-kube-api-access-gxsqz\") pod \"59c685a3-6149-4c66-aec6-48ce0436ac47\" (UID: \"59c685a3-6149-4c66-aec6-48ce0436ac47\") " Mar 10 15:08:04 crc kubenswrapper[4911]: I0310 15:08:04.495406 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c685a3-6149-4c66-aec6-48ce0436ac47-kube-api-access-gxsqz" (OuterVolumeSpecName: "kube-api-access-gxsqz") pod "59c685a3-6149-4c66-aec6-48ce0436ac47" (UID: "59c685a3-6149-4c66-aec6-48ce0436ac47"). InnerVolumeSpecName "kube-api-access-gxsqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:08:04 crc kubenswrapper[4911]: I0310 15:08:04.580194 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxsqz\" (UniqueName: \"kubernetes.io/projected/59c685a3-6149-4c66-aec6-48ce0436ac47-kube-api-access-gxsqz\") on node \"crc\" DevicePath \"\"" Mar 10 15:08:05 crc kubenswrapper[4911]: I0310 15:08:05.027442 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552588-vfx7l" event={"ID":"59c685a3-6149-4c66-aec6-48ce0436ac47","Type":"ContainerDied","Data":"6e2de45b5bd544d7bde8ea39c5d14506350558bd7c193149777c5fe80ab35dd7"} Mar 10 15:08:05 crc kubenswrapper[4911]: I0310 15:08:05.027489 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e2de45b5bd544d7bde8ea39c5d14506350558bd7c193149777c5fe80ab35dd7" Mar 10 15:08:05 crc kubenswrapper[4911]: I0310 15:08:05.027556 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-vfx7l" Mar 10 15:08:05 crc kubenswrapper[4911]: I0310 15:08:05.557577 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552582-qm5rq"] Mar 10 15:08:05 crc kubenswrapper[4911]: I0310 15:08:05.574475 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552582-qm5rq"] Mar 10 15:08:06 crc kubenswrapper[4911]: I0310 15:08:06.211207 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf406f01-5c7f-4439-8061-c41d07896874" path="/var/lib/kubelet/pods/cf406f01-5c7f-4439-8061-c41d07896874/volumes" Mar 10 15:08:59 crc kubenswrapper[4911]: I0310 15:08:59.143492 4911 scope.go:117] "RemoveContainer" containerID="937980658f5d1594d1f0fa3a173b36f09f330b5869b0918be8f31942e949a20c" Mar 10 15:08:59 crc kubenswrapper[4911]: I0310 15:08:59.181969 4911 scope.go:117] "RemoveContainer" containerID="0495e2552c3e735b6e7b76824918427b15f9dc9fb93bad6cd0394544eed53aaa" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.431241 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ckrzf"] Mar 10 15:09:57 crc kubenswrapper[4911]: E0310 15:09:57.432372 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c685a3-6149-4c66-aec6-48ce0436ac47" containerName="oc" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.432391 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c685a3-6149-4c66-aec6-48ce0436ac47" containerName="oc" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.432608 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c685a3-6149-4c66-aec6-48ce0436ac47" containerName="oc" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.434273 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.446181 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ckrzf"] Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.573189 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-catalog-content\") pod \"community-operators-ckrzf\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.573316 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48xql\" (UniqueName: \"kubernetes.io/projected/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-kube-api-access-48xql\") pod \"community-operators-ckrzf\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.573387 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-utilities\") pod \"community-operators-ckrzf\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.675020 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-catalog-content\") pod \"community-operators-ckrzf\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.675097 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-48xql\" (UniqueName: \"kubernetes.io/projected/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-kube-api-access-48xql\") pod \"community-operators-ckrzf\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.675128 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-utilities\") pod \"community-operators-ckrzf\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.676344 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-utilities\") pod \"community-operators-ckrzf\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.676714 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-catalog-content\") pod \"community-operators-ckrzf\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.697380 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48xql\" (UniqueName: \"kubernetes.io/projected/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-kube-api-access-48xql\") pod \"community-operators-ckrzf\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:57 crc kubenswrapper[4911]: I0310 15:09:57.756674 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:09:58 crc kubenswrapper[4911]: I0310 15:09:58.289085 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ckrzf"] Mar 10 15:09:58 crc kubenswrapper[4911]: I0310 15:09:58.671113 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckrzf" event={"ID":"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b","Type":"ContainerStarted","Data":"eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec"} Mar 10 15:09:58 crc kubenswrapper[4911]: I0310 15:09:58.671452 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckrzf" event={"ID":"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b","Type":"ContainerStarted","Data":"ccd38e1e38896c30954406f0425188b9a6623a3dd38d2058d4e3b5c7d67491e0"} Mar 10 15:09:59 crc kubenswrapper[4911]: I0310 15:09:59.683690 4911 generic.go:334] "Generic (PLEG): container finished" podID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" containerID="eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec" exitCode=0 Mar 10 15:09:59 crc kubenswrapper[4911]: I0310 15:09:59.684255 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckrzf" event={"ID":"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b","Type":"ContainerDied","Data":"eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec"} Mar 10 15:09:59 crc kubenswrapper[4911]: I0310 15:09:59.684303 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckrzf" event={"ID":"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b","Type":"ContainerStarted","Data":"f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3"} Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.157098 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552590-qcxxt"] Mar 10 15:10:00 crc 
kubenswrapper[4911]: I0310 15:10:00.159022 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-qcxxt" Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.161371 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.162001 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.162242 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.169067 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-qcxxt"] Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.241637 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn99r\" (UniqueName: \"kubernetes.io/projected/85427124-2b71-420b-856e-863aa5c6c5be-kube-api-access-xn99r\") pod \"auto-csr-approver-29552590-qcxxt\" (UID: \"85427124-2b71-420b-856e-863aa5c6c5be\") " pod="openshift-infra/auto-csr-approver-29552590-qcxxt" Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.343352 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn99r\" (UniqueName: \"kubernetes.io/projected/85427124-2b71-420b-856e-863aa5c6c5be-kube-api-access-xn99r\") pod \"auto-csr-approver-29552590-qcxxt\" (UID: \"85427124-2b71-420b-856e-863aa5c6c5be\") " pod="openshift-infra/auto-csr-approver-29552590-qcxxt" Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.366436 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn99r\" (UniqueName: \"kubernetes.io/projected/85427124-2b71-420b-856e-863aa5c6c5be-kube-api-access-xn99r\") pod 
\"auto-csr-approver-29552590-qcxxt\" (UID: \"85427124-2b71-420b-856e-863aa5c6c5be\") " pod="openshift-infra/auto-csr-approver-29552590-qcxxt" Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.477350 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-qcxxt" Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.696336 4911 generic.go:334] "Generic (PLEG): container finished" podID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" containerID="f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3" exitCode=0 Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.696392 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckrzf" event={"ID":"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b","Type":"ContainerDied","Data":"f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3"} Mar 10 15:10:00 crc kubenswrapper[4911]: I0310 15:10:00.959943 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-qcxxt"] Mar 10 15:10:00 crc kubenswrapper[4911]: W0310 15:10:00.961534 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85427124_2b71_420b_856e_863aa5c6c5be.slice/crio-e85d58b8d5f7eef252399dad3f58de2f7d7efd9adf4286b875ac1c700ca96fba WatchSource:0}: Error finding container e85d58b8d5f7eef252399dad3f58de2f7d7efd9adf4286b875ac1c700ca96fba: Status 404 returned error can't find the container with id e85d58b8d5f7eef252399dad3f58de2f7d7efd9adf4286b875ac1c700ca96fba Mar 10 15:10:01 crc kubenswrapper[4911]: I0310 15:10:01.708123 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-qcxxt" event={"ID":"85427124-2b71-420b-856e-863aa5c6c5be","Type":"ContainerStarted","Data":"e85d58b8d5f7eef252399dad3f58de2f7d7efd9adf4286b875ac1c700ca96fba"} Mar 10 15:10:01 crc kubenswrapper[4911]: I0310 
15:10:01.743533 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckrzf" event={"ID":"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b","Type":"ContainerStarted","Data":"dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c"} Mar 10 15:10:01 crc kubenswrapper[4911]: I0310 15:10:01.777190 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ckrzf" podStartSLOduration=2.263588429 podStartE2EDuration="4.777161931s" podCreationTimestamp="2026-03-10 15:09:57 +0000 UTC" firstStartedPulling="2026-03-10 15:09:58.673634599 +0000 UTC m=+4103.237154516" lastFinishedPulling="2026-03-10 15:10:01.187208101 +0000 UTC m=+4105.750728018" observedRunningTime="2026-03-10 15:10:01.764444215 +0000 UTC m=+4106.327964132" watchObservedRunningTime="2026-03-10 15:10:01.777161931 +0000 UTC m=+4106.340681858" Mar 10 15:10:02 crc kubenswrapper[4911]: I0310 15:10:02.756249 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-qcxxt" event={"ID":"85427124-2b71-420b-856e-863aa5c6c5be","Type":"ContainerStarted","Data":"8312e4d97553517362fd42ffd33f0040bcaa6a5e00af0fa1e94a31587db9f12f"} Mar 10 15:10:03 crc kubenswrapper[4911]: I0310 15:10:03.768215 4911 generic.go:334] "Generic (PLEG): container finished" podID="85427124-2b71-420b-856e-863aa5c6c5be" containerID="8312e4d97553517362fd42ffd33f0040bcaa6a5e00af0fa1e94a31587db9f12f" exitCode=0 Mar 10 15:10:03 crc kubenswrapper[4911]: I0310 15:10:03.768322 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-qcxxt" event={"ID":"85427124-2b71-420b-856e-863aa5c6c5be","Type":"ContainerDied","Data":"8312e4d97553517362fd42ffd33f0040bcaa6a5e00af0fa1e94a31587db9f12f"} Mar 10 15:10:05 crc kubenswrapper[4911]: I0310 15:10:05.158006 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-qcxxt" Mar 10 15:10:05 crc kubenswrapper[4911]: I0310 15:10:05.263749 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn99r\" (UniqueName: \"kubernetes.io/projected/85427124-2b71-420b-856e-863aa5c6c5be-kube-api-access-xn99r\") pod \"85427124-2b71-420b-856e-863aa5c6c5be\" (UID: \"85427124-2b71-420b-856e-863aa5c6c5be\") " Mar 10 15:10:05 crc kubenswrapper[4911]: I0310 15:10:05.269082 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85427124-2b71-420b-856e-863aa5c6c5be-kube-api-access-xn99r" (OuterVolumeSpecName: "kube-api-access-xn99r") pod "85427124-2b71-420b-856e-863aa5c6c5be" (UID: "85427124-2b71-420b-856e-863aa5c6c5be"). InnerVolumeSpecName "kube-api-access-xn99r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:05 crc kubenswrapper[4911]: I0310 15:10:05.366222 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn99r\" (UniqueName: \"kubernetes.io/projected/85427124-2b71-420b-856e-863aa5c6c5be-kube-api-access-xn99r\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:05 crc kubenswrapper[4911]: I0310 15:10:05.789813 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-qcxxt" event={"ID":"85427124-2b71-420b-856e-863aa5c6c5be","Type":"ContainerDied","Data":"e85d58b8d5f7eef252399dad3f58de2f7d7efd9adf4286b875ac1c700ca96fba"} Mar 10 15:10:05 crc kubenswrapper[4911]: I0310 15:10:05.789865 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e85d58b8d5f7eef252399dad3f58de2f7d7efd9adf4286b875ac1c700ca96fba" Mar 10 15:10:05 crc kubenswrapper[4911]: I0310 15:10:05.789864 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-qcxxt" Mar 10 15:10:06 crc kubenswrapper[4911]: I0310 15:10:06.259464 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552584-x2k99"] Mar 10 15:10:06 crc kubenswrapper[4911]: I0310 15:10:06.268750 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552584-x2k99"] Mar 10 15:10:07 crc kubenswrapper[4911]: I0310 15:10:07.757309 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:10:07 crc kubenswrapper[4911]: I0310 15:10:07.757394 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:10:07 crc kubenswrapper[4911]: I0310 15:10:07.817983 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:10:07 crc kubenswrapper[4911]: I0310 15:10:07.880112 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:10:08 crc kubenswrapper[4911]: I0310 15:10:08.064381 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ckrzf"] Mar 10 15:10:08 crc kubenswrapper[4911]: I0310 15:10:08.211863 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec44bb1-d5c7-422e-9d9f-a1a94d44b259" path="/var/lib/kubelet/pods/7ec44bb1-d5c7-422e-9d9f-a1a94d44b259/volumes" Mar 10 15:10:09 crc kubenswrapper[4911]: I0310 15:10:09.839862 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ckrzf" podUID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" containerName="registry-server" containerID="cri-o://dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c" gracePeriod=2 Mar 10 15:10:10 
crc kubenswrapper[4911]: I0310 15:10:10.823474 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.892986 4911 generic.go:334] "Generic (PLEG): container finished" podID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" containerID="dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c" exitCode=0 Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.893064 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckrzf" event={"ID":"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b","Type":"ContainerDied","Data":"dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c"} Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.893100 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckrzf" event={"ID":"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b","Type":"ContainerDied","Data":"ccd38e1e38896c30954406f0425188b9a6623a3dd38d2058d4e3b5c7d67491e0"} Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.893129 4911 scope.go:117] "RemoveContainer" containerID="dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.893391 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ckrzf" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.925961 4911 scope.go:117] "RemoveContainer" containerID="f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.954554 4911 scope.go:117] "RemoveContainer" containerID="eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.990833 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-catalog-content\") pod \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.991089 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48xql\" (UniqueName: \"kubernetes.io/projected/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-kube-api-access-48xql\") pod \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.991258 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-utilities\") pod \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\" (UID: \"3b61f4f9-cb87-45db-9fca-3a6d2b5f530b\") " Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.992491 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-utilities" (OuterVolumeSpecName: "utilities") pod "3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" (UID: "3b61f4f9-cb87-45db-9fca-3a6d2b5f530b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.996805 4911 scope.go:117] "RemoveContainer" containerID="dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c" Mar 10 15:10:10 crc kubenswrapper[4911]: E0310 15:10:10.997616 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c\": container with ID starting with dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c not found: ID does not exist" containerID="dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.997668 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c"} err="failed to get container status \"dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c\": rpc error: code = NotFound desc = could not find container \"dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c\": container with ID starting with dde38d2391f8a5c4ddd43b0415786b1642b7b25684e04341018df2130509ce6c not found: ID does not exist" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.997696 4911 scope.go:117] "RemoveContainer" containerID="f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3" Mar 10 15:10:10 crc kubenswrapper[4911]: E0310 15:10:10.998472 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3\": container with ID starting with f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3 not found: ID does not exist" containerID="f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.998504 
4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3"} err="failed to get container status \"f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3\": rpc error: code = NotFound desc = could not find container \"f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3\": container with ID starting with f7622101de31135334a46809fe27103259be40b639e11085932b9b2bf0be19e3 not found: ID does not exist" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.998525 4911 scope.go:117] "RemoveContainer" containerID="eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec" Mar 10 15:10:10 crc kubenswrapper[4911]: E0310 15:10:10.999207 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec\": container with ID starting with eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec not found: ID does not exist" containerID="eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.999235 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec"} err="failed to get container status \"eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec\": rpc error: code = NotFound desc = could not find container \"eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec\": container with ID starting with eb24434f10610aed04c4296bbc85fb977d2edd0b4c774e2df1a07d32fe0608ec not found: ID does not exist" Mar 10 15:10:10 crc kubenswrapper[4911]: I0310 15:10:10.999821 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-kube-api-access-48xql" 
(OuterVolumeSpecName: "kube-api-access-48xql") pod "3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" (UID: "3b61f4f9-cb87-45db-9fca-3a6d2b5f530b"). InnerVolumeSpecName "kube-api-access-48xql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:11 crc kubenswrapper[4911]: I0310 15:10:11.054236 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" (UID: "3b61f4f9-cb87-45db-9fca-3a6d2b5f530b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:11 crc kubenswrapper[4911]: I0310 15:10:11.093722 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:11 crc kubenswrapper[4911]: I0310 15:10:11.093775 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48xql\" (UniqueName: \"kubernetes.io/projected/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-kube-api-access-48xql\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:11 crc kubenswrapper[4911]: I0310 15:10:11.093788 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:11 crc kubenswrapper[4911]: I0310 15:10:11.230551 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ckrzf"] Mar 10 15:10:11 crc kubenswrapper[4911]: I0310 15:10:11.239842 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ckrzf"] Mar 10 15:10:12 crc kubenswrapper[4911]: I0310 15:10:12.208124 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" path="/var/lib/kubelet/pods/3b61f4f9-cb87-45db-9fca-3a6d2b5f530b/volumes" Mar 10 15:10:18 crc kubenswrapper[4911]: I0310 15:10:18.520974 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:10:18 crc kubenswrapper[4911]: I0310 15:10:18.521428 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.642489 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f8lcn/must-gather-c7x5z"] Mar 10 15:10:37 crc kubenswrapper[4911]: E0310 15:10:37.643577 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" containerName="extract-utilities" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.643594 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" containerName="extract-utilities" Mar 10 15:10:37 crc kubenswrapper[4911]: E0310 15:10:37.643613 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85427124-2b71-420b-856e-863aa5c6c5be" containerName="oc" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.643619 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="85427124-2b71-420b-856e-863aa5c6c5be" containerName="oc" Mar 10 15:10:37 crc kubenswrapper[4911]: E0310 15:10:37.643630 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" 
containerName="extract-content" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.643636 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" containerName="extract-content" Mar 10 15:10:37 crc kubenswrapper[4911]: E0310 15:10:37.643670 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" containerName="registry-server" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.643675 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" containerName="registry-server" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.643964 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b61f4f9-cb87-45db-9fca-3a6d2b5f530b" containerName="registry-server" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.643974 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="85427124-2b71-420b-856e-863aa5c6c5be" containerName="oc" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.645139 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f8lcn/must-gather-c7x5z" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.648599 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f8lcn"/"default-dockercfg-bv2jb" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.648679 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f8lcn"/"kube-root-ca.crt" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.658905 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f8lcn"/"openshift-service-ca.crt" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.676230 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f8lcn/must-gather-c7x5z"] Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.841430 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e94b793-17c7-4869-a3c5-089010867649-must-gather-output\") pod \"must-gather-c7x5z\" (UID: \"0e94b793-17c7-4869-a3c5-089010867649\") " pod="openshift-must-gather-f8lcn/must-gather-c7x5z" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.841642 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdpn\" (UniqueName: \"kubernetes.io/projected/0e94b793-17c7-4869-a3c5-089010867649-kube-api-access-7vdpn\") pod \"must-gather-c7x5z\" (UID: \"0e94b793-17c7-4869-a3c5-089010867649\") " pod="openshift-must-gather-f8lcn/must-gather-c7x5z" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.943383 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vdpn\" (UniqueName: \"kubernetes.io/projected/0e94b793-17c7-4869-a3c5-089010867649-kube-api-access-7vdpn\") pod \"must-gather-c7x5z\" (UID: \"0e94b793-17c7-4869-a3c5-089010867649\") " 
pod="openshift-must-gather-f8lcn/must-gather-c7x5z" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.943504 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e94b793-17c7-4869-a3c5-089010867649-must-gather-output\") pod \"must-gather-c7x5z\" (UID: \"0e94b793-17c7-4869-a3c5-089010867649\") " pod="openshift-must-gather-f8lcn/must-gather-c7x5z" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.944224 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e94b793-17c7-4869-a3c5-089010867649-must-gather-output\") pod \"must-gather-c7x5z\" (UID: \"0e94b793-17c7-4869-a3c5-089010867649\") " pod="openshift-must-gather-f8lcn/must-gather-c7x5z" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.964925 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdpn\" (UniqueName: \"kubernetes.io/projected/0e94b793-17c7-4869-a3c5-089010867649-kube-api-access-7vdpn\") pod \"must-gather-c7x5z\" (UID: \"0e94b793-17c7-4869-a3c5-089010867649\") " pod="openshift-must-gather-f8lcn/must-gather-c7x5z" Mar 10 15:10:37 crc kubenswrapper[4911]: I0310 15:10:37.973695 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f8lcn/must-gather-c7x5z" Mar 10 15:10:38 crc kubenswrapper[4911]: I0310 15:10:38.695667 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f8lcn/must-gather-c7x5z"] Mar 10 15:10:39 crc kubenswrapper[4911]: I0310 15:10:39.236029 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/must-gather-c7x5z" event={"ID":"0e94b793-17c7-4869-a3c5-089010867649","Type":"ContainerStarted","Data":"babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794"} Mar 10 15:10:39 crc kubenswrapper[4911]: I0310 15:10:39.236340 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/must-gather-c7x5z" event={"ID":"0e94b793-17c7-4869-a3c5-089010867649","Type":"ContainerStarted","Data":"e4fcf8cf3dcfccaab536e70dc5ea098de0f7a8cc34be97d9e971ec1287370a20"} Mar 10 15:10:40 crc kubenswrapper[4911]: I0310 15:10:40.247849 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/must-gather-c7x5z" event={"ID":"0e94b793-17c7-4869-a3c5-089010867649","Type":"ContainerStarted","Data":"94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba"} Mar 10 15:10:40 crc kubenswrapper[4911]: I0310 15:10:40.266622 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f8lcn/must-gather-c7x5z" podStartSLOduration=3.26659431 podStartE2EDuration="3.26659431s" podCreationTimestamp="2026-03-10 15:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:40.266333393 +0000 UTC m=+4144.829853310" watchObservedRunningTime="2026-03-10 15:10:40.26659431 +0000 UTC m=+4144.830114227" Mar 10 15:10:43 crc kubenswrapper[4911]: I0310 15:10:43.684095 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f8lcn/crc-debug-xf9b9"] Mar 10 15:10:43 crc kubenswrapper[4911]: 
I0310 15:10:43.686438 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" Mar 10 15:10:43 crc kubenswrapper[4911]: I0310 15:10:43.789678 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7tbx\" (UniqueName: \"kubernetes.io/projected/b452cba2-1f6c-4d4c-aa2c-7494736a5943-kube-api-access-x7tbx\") pod \"crc-debug-xf9b9\" (UID: \"b452cba2-1f6c-4d4c-aa2c-7494736a5943\") " pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" Mar 10 15:10:43 crc kubenswrapper[4911]: I0310 15:10:43.790230 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b452cba2-1f6c-4d4c-aa2c-7494736a5943-host\") pod \"crc-debug-xf9b9\" (UID: \"b452cba2-1f6c-4d4c-aa2c-7494736a5943\") " pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" Mar 10 15:10:43 crc kubenswrapper[4911]: I0310 15:10:43.892118 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7tbx\" (UniqueName: \"kubernetes.io/projected/b452cba2-1f6c-4d4c-aa2c-7494736a5943-kube-api-access-x7tbx\") pod \"crc-debug-xf9b9\" (UID: \"b452cba2-1f6c-4d4c-aa2c-7494736a5943\") " pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" Mar 10 15:10:43 crc kubenswrapper[4911]: I0310 15:10:43.892491 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b452cba2-1f6c-4d4c-aa2c-7494736a5943-host\") pod \"crc-debug-xf9b9\" (UID: \"b452cba2-1f6c-4d4c-aa2c-7494736a5943\") " pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" Mar 10 15:10:43 crc kubenswrapper[4911]: I0310 15:10:43.892703 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b452cba2-1f6c-4d4c-aa2c-7494736a5943-host\") pod \"crc-debug-xf9b9\" (UID: \"b452cba2-1f6c-4d4c-aa2c-7494736a5943\") 
" pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" Mar 10 15:10:43 crc kubenswrapper[4911]: I0310 15:10:43.918992 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7tbx\" (UniqueName: \"kubernetes.io/projected/b452cba2-1f6c-4d4c-aa2c-7494736a5943-kube-api-access-x7tbx\") pod \"crc-debug-xf9b9\" (UID: \"b452cba2-1f6c-4d4c-aa2c-7494736a5943\") " pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" Mar 10 15:10:44 crc kubenswrapper[4911]: I0310 15:10:44.007558 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" Mar 10 15:10:44 crc kubenswrapper[4911]: W0310 15:10:44.056108 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb452cba2_1f6c_4d4c_aa2c_7494736a5943.slice/crio-1e24b9ca3ae32537eff0b759df79ba72e6192267d089819893ce0ecbc4b5ee77 WatchSource:0}: Error finding container 1e24b9ca3ae32537eff0b759df79ba72e6192267d089819893ce0ecbc4b5ee77: Status 404 returned error can't find the container with id 1e24b9ca3ae32537eff0b759df79ba72e6192267d089819893ce0ecbc4b5ee77 Mar 10 15:10:44 crc kubenswrapper[4911]: I0310 15:10:44.303317 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" event={"ID":"b452cba2-1f6c-4d4c-aa2c-7494736a5943","Type":"ContainerStarted","Data":"679f4da8d87d6df31ddcf8234a0eb46da424cd2641ff21b98d49c3b14f092cbd"} Mar 10 15:10:44 crc kubenswrapper[4911]: I0310 15:10:44.303861 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" event={"ID":"b452cba2-1f6c-4d4c-aa2c-7494736a5943","Type":"ContainerStarted","Data":"1e24b9ca3ae32537eff0b759df79ba72e6192267d089819893ce0ecbc4b5ee77"} Mar 10 15:10:44 crc kubenswrapper[4911]: I0310 15:10:44.326415 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" 
podStartSLOduration=1.3263811429999999 podStartE2EDuration="1.326381143s" podCreationTimestamp="2026-03-10 15:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:44.317642829 +0000 UTC m=+4148.881162766" watchObservedRunningTime="2026-03-10 15:10:44.326381143 +0000 UTC m=+4148.889901060" Mar 10 15:10:48 crc kubenswrapper[4911]: I0310 15:10:48.521465 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:10:48 crc kubenswrapper[4911]: I0310 15:10:48.522239 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:10:59 crc kubenswrapper[4911]: I0310 15:10:59.334107 4911 scope.go:117] "RemoveContainer" containerID="d70a29bd4a46f8367eaa919ac5de6960ba848defe1ca97faa72de1bc36ae127d" Mar 10 15:11:18 crc kubenswrapper[4911]: I0310 15:11:18.521322 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:11:18 crc kubenswrapper[4911]: I0310 15:11:18.522232 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4911]: I0310 15:11:18.522297 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 15:11:18 crc kubenswrapper[4911]: I0310 15:11:18.523260 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07403a5bc42ca34f5ff0c49121e95c611f228b955cde52cc7ca9346c9d02ab86"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:11:18 crc kubenswrapper[4911]: I0310 15:11:18.523321 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://07403a5bc42ca34f5ff0c49121e95c611f228b955cde52cc7ca9346c9d02ab86" gracePeriod=600 Mar 10 15:11:19 crc kubenswrapper[4911]: I0310 15:11:19.643195 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="07403a5bc42ca34f5ff0c49121e95c611f228b955cde52cc7ca9346c9d02ab86" exitCode=0 Mar 10 15:11:19 crc kubenswrapper[4911]: I0310 15:11:19.643299 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"07403a5bc42ca34f5ff0c49121e95c611f228b955cde52cc7ca9346c9d02ab86"} Mar 10 15:11:19 crc kubenswrapper[4911]: I0310 15:11:19.645143 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" 
event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5"} Mar 10 15:11:19 crc kubenswrapper[4911]: I0310 15:11:19.645306 4911 scope.go:117] "RemoveContainer" containerID="a12843c07b7f3b24a0a7fe1f89fc54a70b68209f9dbed37cb5d083f7bf1fc848" Mar 10 15:11:21 crc kubenswrapper[4911]: I0310 15:11:21.670277 4911 generic.go:334] "Generic (PLEG): container finished" podID="b452cba2-1f6c-4d4c-aa2c-7494736a5943" containerID="679f4da8d87d6df31ddcf8234a0eb46da424cd2641ff21b98d49c3b14f092cbd" exitCode=0 Mar 10 15:11:21 crc kubenswrapper[4911]: I0310 15:11:21.670379 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/crc-debug-xf9b9" event={"ID":"b452cba2-1f6c-4d4c-aa2c-7494736a5943","Type":"ContainerDied","Data":"679f4da8d87d6df31ddcf8234a0eb46da424cd2641ff21b98d49c3b14f092cbd"} Mar 10 15:11:22 crc kubenswrapper[4911]: I0310 15:11:22.802694 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-xf9b9"
Mar 10 15:11:22 crc kubenswrapper[4911]: I0310 15:11:22.850530 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f8lcn/crc-debug-xf9b9"]
Mar 10 15:11:22 crc kubenswrapper[4911]: I0310 15:11:22.861528 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f8lcn/crc-debug-xf9b9"]
Mar 10 15:11:22 crc kubenswrapper[4911]: I0310 15:11:22.925526 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7tbx\" (UniqueName: \"kubernetes.io/projected/b452cba2-1f6c-4d4c-aa2c-7494736a5943-kube-api-access-x7tbx\") pod \"b452cba2-1f6c-4d4c-aa2c-7494736a5943\" (UID: \"b452cba2-1f6c-4d4c-aa2c-7494736a5943\") "
Mar 10 15:11:22 crc kubenswrapper[4911]: I0310 15:11:22.925752 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b452cba2-1f6c-4d4c-aa2c-7494736a5943-host\") pod \"b452cba2-1f6c-4d4c-aa2c-7494736a5943\" (UID: \"b452cba2-1f6c-4d4c-aa2c-7494736a5943\") "
Mar 10 15:11:22 crc kubenswrapper[4911]: I0310 15:11:22.926136 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b452cba2-1f6c-4d4c-aa2c-7494736a5943-host" (OuterVolumeSpecName: "host") pod "b452cba2-1f6c-4d4c-aa2c-7494736a5943" (UID: "b452cba2-1f6c-4d4c-aa2c-7494736a5943"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:11:22 crc kubenswrapper[4911]: I0310 15:11:22.926886 4911 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b452cba2-1f6c-4d4c-aa2c-7494736a5943-host\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:23 crc kubenswrapper[4911]: I0310 15:11:23.425129 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b452cba2-1f6c-4d4c-aa2c-7494736a5943-kube-api-access-x7tbx" (OuterVolumeSpecName: "kube-api-access-x7tbx") pod "b452cba2-1f6c-4d4c-aa2c-7494736a5943" (UID: "b452cba2-1f6c-4d4c-aa2c-7494736a5943"). InnerVolumeSpecName "kube-api-access-x7tbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:11:23 crc kubenswrapper[4911]: I0310 15:11:23.440823 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7tbx\" (UniqueName: \"kubernetes.io/projected/b452cba2-1f6c-4d4c-aa2c-7494736a5943-kube-api-access-x7tbx\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:23 crc kubenswrapper[4911]: I0310 15:11:23.691501 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e24b9ca3ae32537eff0b759df79ba72e6192267d089819893ce0ecbc4b5ee77"
Mar 10 15:11:23 crc kubenswrapper[4911]: I0310 15:11:23.691561 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-xf9b9"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.064405 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f8lcn/crc-debug-6zcrd"]
Mar 10 15:11:24 crc kubenswrapper[4911]: E0310 15:11:24.065947 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b452cba2-1f6c-4d4c-aa2c-7494736a5943" containerName="container-00"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.065967 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b452cba2-1f6c-4d4c-aa2c-7494736a5943" containerName="container-00"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.066176 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b452cba2-1f6c-4d4c-aa2c-7494736a5943" containerName="container-00"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.067050 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-6zcrd"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.154273 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqc7l\" (UniqueName: \"kubernetes.io/projected/b197a866-7a41-428a-977e-47fcafd66251-kube-api-access-fqc7l\") pod \"crc-debug-6zcrd\" (UID: \"b197a866-7a41-428a-977e-47fcafd66251\") " pod="openshift-must-gather-f8lcn/crc-debug-6zcrd"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.154424 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b197a866-7a41-428a-977e-47fcafd66251-host\") pod \"crc-debug-6zcrd\" (UID: \"b197a866-7a41-428a-977e-47fcafd66251\") " pod="openshift-must-gather-f8lcn/crc-debug-6zcrd"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.205600 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b452cba2-1f6c-4d4c-aa2c-7494736a5943" path="/var/lib/kubelet/pods/b452cba2-1f6c-4d4c-aa2c-7494736a5943/volumes"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.255981 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqc7l\" (UniqueName: \"kubernetes.io/projected/b197a866-7a41-428a-977e-47fcafd66251-kube-api-access-fqc7l\") pod \"crc-debug-6zcrd\" (UID: \"b197a866-7a41-428a-977e-47fcafd66251\") " pod="openshift-must-gather-f8lcn/crc-debug-6zcrd"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.256079 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b197a866-7a41-428a-977e-47fcafd66251-host\") pod \"crc-debug-6zcrd\" (UID: \"b197a866-7a41-428a-977e-47fcafd66251\") " pod="openshift-must-gather-f8lcn/crc-debug-6zcrd"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.256235 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b197a866-7a41-428a-977e-47fcafd66251-host\") pod \"crc-debug-6zcrd\" (UID: \"b197a866-7a41-428a-977e-47fcafd66251\") " pod="openshift-must-gather-f8lcn/crc-debug-6zcrd"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.279771 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqc7l\" (UniqueName: \"kubernetes.io/projected/b197a866-7a41-428a-977e-47fcafd66251-kube-api-access-fqc7l\") pod \"crc-debug-6zcrd\" (UID: \"b197a866-7a41-428a-977e-47fcafd66251\") " pod="openshift-must-gather-f8lcn/crc-debug-6zcrd"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.383428 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-6zcrd"
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.701950 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/crc-debug-6zcrd" event={"ID":"b197a866-7a41-428a-977e-47fcafd66251","Type":"ContainerStarted","Data":"5a5fb87dff2ce11c5b49f2aed8f851990233d3e47ba054873546618adb95f9c4"}
Mar 10 15:11:24 crc kubenswrapper[4911]: I0310 15:11:24.722501 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f8lcn/crc-debug-6zcrd" podStartSLOduration=0.722481152 podStartE2EDuration="722.481152ms" podCreationTimestamp="2026-03-10 15:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:11:24.714309893 +0000 UTC m=+4189.277829810" watchObservedRunningTime="2026-03-10 15:11:24.722481152 +0000 UTC m=+4189.286001069"
Mar 10 15:11:25 crc kubenswrapper[4911]: I0310 15:11:25.714326 4911 generic.go:334] "Generic (PLEG): container finished" podID="b197a866-7a41-428a-977e-47fcafd66251" containerID="1a47eec5a53819fea44944bfce4685f58f8a8fb0398387dac6dd90bb6660c866" exitCode=0
Mar 10 15:11:25 crc kubenswrapper[4911]: I0310 15:11:25.714422 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/crc-debug-6zcrd" event={"ID":"b197a866-7a41-428a-977e-47fcafd66251","Type":"ContainerDied","Data":"1a47eec5a53819fea44944bfce4685f58f8a8fb0398387dac6dd90bb6660c866"}
Mar 10 15:11:26 crc kubenswrapper[4911]: I0310 15:11:26.841822 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-6zcrd"
Mar 10 15:11:26 crc kubenswrapper[4911]: I0310 15:11:26.874549 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f8lcn/crc-debug-6zcrd"]
Mar 10 15:11:26 crc kubenswrapper[4911]: I0310 15:11:26.884881 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f8lcn/crc-debug-6zcrd"]
Mar 10 15:11:26 crc kubenswrapper[4911]: I0310 15:11:26.909356 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqc7l\" (UniqueName: \"kubernetes.io/projected/b197a866-7a41-428a-977e-47fcafd66251-kube-api-access-fqc7l\") pod \"b197a866-7a41-428a-977e-47fcafd66251\" (UID: \"b197a866-7a41-428a-977e-47fcafd66251\") "
Mar 10 15:11:26 crc kubenswrapper[4911]: I0310 15:11:26.909412 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b197a866-7a41-428a-977e-47fcafd66251-host\") pod \"b197a866-7a41-428a-977e-47fcafd66251\" (UID: \"b197a866-7a41-428a-977e-47fcafd66251\") "
Mar 10 15:11:26 crc kubenswrapper[4911]: I0310 15:11:26.910092 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b197a866-7a41-428a-977e-47fcafd66251-host" (OuterVolumeSpecName: "host") pod "b197a866-7a41-428a-977e-47fcafd66251" (UID: "b197a866-7a41-428a-977e-47fcafd66251"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:11:26 crc kubenswrapper[4911]: I0310 15:11:26.910289 4911 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b197a866-7a41-428a-977e-47fcafd66251-host\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:26 crc kubenswrapper[4911]: I0310 15:11:26.916964 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b197a866-7a41-428a-977e-47fcafd66251-kube-api-access-fqc7l" (OuterVolumeSpecName: "kube-api-access-fqc7l") pod "b197a866-7a41-428a-977e-47fcafd66251" (UID: "b197a866-7a41-428a-977e-47fcafd66251"). InnerVolumeSpecName "kube-api-access-fqc7l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:11:27 crc kubenswrapper[4911]: I0310 15:11:27.012398 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqc7l\" (UniqueName: \"kubernetes.io/projected/b197a866-7a41-428a-977e-47fcafd66251-kube-api-access-fqc7l\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:27 crc kubenswrapper[4911]: I0310 15:11:27.735659 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a5fb87dff2ce11c5b49f2aed8f851990233d3e47ba054873546618adb95f9c4"
Mar 10 15:11:27 crc kubenswrapper[4911]: I0310 15:11:27.735747 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-6zcrd"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.111740 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f8lcn/crc-debug-rj7zt"]
Mar 10 15:11:28 crc kubenswrapper[4911]: E0310 15:11:28.112325 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b197a866-7a41-428a-977e-47fcafd66251" containerName="container-00"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.112340 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b197a866-7a41-428a-977e-47fcafd66251" containerName="container-00"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.112558 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b197a866-7a41-428a-977e-47fcafd66251" containerName="container-00"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.113481 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-rj7zt"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.207469 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b197a866-7a41-428a-977e-47fcafd66251" path="/var/lib/kubelet/pods/b197a866-7a41-428a-977e-47fcafd66251/volumes"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.237576 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdggf\" (UniqueName: \"kubernetes.io/projected/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-kube-api-access-zdggf\") pod \"crc-debug-rj7zt\" (UID: \"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0\") " pod="openshift-must-gather-f8lcn/crc-debug-rj7zt"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.237672 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-host\") pod \"crc-debug-rj7zt\" (UID: \"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0\") " pod="openshift-must-gather-f8lcn/crc-debug-rj7zt"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.340466 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdggf\" (UniqueName: \"kubernetes.io/projected/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-kube-api-access-zdggf\") pod \"crc-debug-rj7zt\" (UID: \"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0\") " pod="openshift-must-gather-f8lcn/crc-debug-rj7zt"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.340983 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-host\") pod \"crc-debug-rj7zt\" (UID: \"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0\") " pod="openshift-must-gather-f8lcn/crc-debug-rj7zt"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.341176 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-host\") pod \"crc-debug-rj7zt\" (UID: \"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0\") " pod="openshift-must-gather-f8lcn/crc-debug-rj7zt"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.391800 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdggf\" (UniqueName: \"kubernetes.io/projected/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-kube-api-access-zdggf\") pod \"crc-debug-rj7zt\" (UID: \"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0\") " pod="openshift-must-gather-f8lcn/crc-debug-rj7zt"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.431532 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-rj7zt"
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.747990 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/crc-debug-rj7zt" event={"ID":"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0","Type":"ContainerStarted","Data":"5296459874ddb03ed0eb0d4233e99aadd37e2f26b76fc31d667b925d68a9e332"}
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.748676 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/crc-debug-rj7zt" event={"ID":"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0","Type":"ContainerStarted","Data":"9edce1f0621913f1d32f8eb2f7dcd08b04f64e5a93c853d6071f2ff6bb801791"}
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.800899 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f8lcn/crc-debug-rj7zt"]
Mar 10 15:11:28 crc kubenswrapper[4911]: I0310 15:11:28.815075 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f8lcn/crc-debug-rj7zt"]
Mar 10 15:11:29 crc kubenswrapper[4911]: I0310 15:11:29.762438 4911 generic.go:334] "Generic (PLEG): container finished" podID="5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0" containerID="5296459874ddb03ed0eb0d4233e99aadd37e2f26b76fc31d667b925d68a9e332" exitCode=0
Mar 10 15:11:29 crc kubenswrapper[4911]: I0310 15:11:29.890506 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-rj7zt"
Mar 10 15:11:29 crc kubenswrapper[4911]: I0310 15:11:29.978656 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-host\") pod \"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0\" (UID: \"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0\") "
Mar 10 15:11:29 crc kubenswrapper[4911]: I0310 15:11:29.978820 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-host" (OuterVolumeSpecName: "host") pod "5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0" (UID: "5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:11:29 crc kubenswrapper[4911]: I0310 15:11:29.979086 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdggf\" (UniqueName: \"kubernetes.io/projected/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-kube-api-access-zdggf\") pod \"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0\" (UID: \"5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0\") "
Mar 10 15:11:29 crc kubenswrapper[4911]: I0310 15:11:29.980171 4911 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-host\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:29 crc kubenswrapper[4911]: I0310 15:11:29.990715 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-kube-api-access-zdggf" (OuterVolumeSpecName: "kube-api-access-zdggf") pod "5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0" (UID: "5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0"). InnerVolumeSpecName "kube-api-access-zdggf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:11:30 crc kubenswrapper[4911]: I0310 15:11:30.082295 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdggf\" (UniqueName: \"kubernetes.io/projected/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0-kube-api-access-zdggf\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:30 crc kubenswrapper[4911]: I0310 15:11:30.205140 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0" path="/var/lib/kubelet/pods/5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0/volumes"
Mar 10 15:11:30 crc kubenswrapper[4911]: I0310 15:11:30.773004 4911 scope.go:117] "RemoveContainer" containerID="5296459874ddb03ed0eb0d4233e99aadd37e2f26b76fc31d667b925d68a9e332"
Mar 10 15:11:30 crc kubenswrapper[4911]: I0310 15:11:30.773185 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/crc-debug-rj7zt"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.162537 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552592-vcnvx"]
Mar 10 15:12:00 crc kubenswrapper[4911]: E0310 15:12:00.163807 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0" containerName="container-00"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.163829 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0" containerName="container-00"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.164152 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac4b3c5-1f15-4236-baa7-1ae3e09e8ef0" containerName="container-00"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.165081 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-vcnvx"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.167193 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.167260 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.168273 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.171632 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-vcnvx"]
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.275019 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7j2q\" (UniqueName: \"kubernetes.io/projected/e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c-kube-api-access-n7j2q\") pod \"auto-csr-approver-29552592-vcnvx\" (UID: \"e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c\") " pod="openshift-infra/auto-csr-approver-29552592-vcnvx"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.377000 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7j2q\" (UniqueName: \"kubernetes.io/projected/e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c-kube-api-access-n7j2q\") pod \"auto-csr-approver-29552592-vcnvx\" (UID: \"e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c\") " pod="openshift-infra/auto-csr-approver-29552592-vcnvx"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.396020 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7j2q\" (UniqueName: \"kubernetes.io/projected/e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c-kube-api-access-n7j2q\") pod \"auto-csr-approver-29552592-vcnvx\" (UID: \"e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c\") " pod="openshift-infra/auto-csr-approver-29552592-vcnvx"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.486185 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-vcnvx"
Mar 10 15:12:00 crc kubenswrapper[4911]: I0310 15:12:00.942001 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-vcnvx"]
Mar 10 15:12:01 crc kubenswrapper[4911]: I0310 15:12:01.130868 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552592-vcnvx" event={"ID":"e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c","Type":"ContainerStarted","Data":"d5b8275b65547cdf2291e5123312a6b45ab78dcd8917e76d4ff6d3535b6fccc6"}
Mar 10 15:12:03 crc kubenswrapper[4911]: I0310 15:12:03.175787 4911 generic.go:334] "Generic (PLEG): container finished" podID="e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c" containerID="3c349e829765ed9c371f7c3b7400deb2abbe6a8d7ceee9148f2bd4df79aa3c38" exitCode=0
Mar 10 15:12:03 crc kubenswrapper[4911]: I0310 15:12:03.176270 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552592-vcnvx" event={"ID":"e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c","Type":"ContainerDied","Data":"3c349e829765ed9c371f7c3b7400deb2abbe6a8d7ceee9148f2bd4df79aa3c38"}
Mar 10 15:12:03 crc kubenswrapper[4911]: I0310 15:12:03.578516 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6fb47b4698-gx22c_2689d664-9bf8-4c5b-8c53-353286854071/barbican-api/0.log"
Mar 10 15:12:03 crc kubenswrapper[4911]: I0310 15:12:03.770639 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6fb47b4698-gx22c_2689d664-9bf8-4c5b-8c53-353286854071/barbican-api-log/0.log"
Mar 10 15:12:03 crc kubenswrapper[4911]: I0310 15:12:03.939433 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8f48b4f88-jjg7s_d66c76f8-6b9a-40d3-b5fc-d2d5790928f6/barbican-keystone-listener/0.log"
Mar 10 15:12:03 crc kubenswrapper[4911]: I0310 15:12:03.973030 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8f48b4f88-jjg7s_d66c76f8-6b9a-40d3-b5fc-d2d5790928f6/barbican-keystone-listener-log/0.log"
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.106439 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8599db9-k9r6m_c049ccee-c503-43b1-b263-c6ee453e93e0/barbican-worker/0.log"
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.164576 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8599db9-k9r6m_c049ccee-c503-43b1-b263-c6ee453e93e0/barbican-worker-log/0.log"
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.313237 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8cfv8_98224edf-8b07-4753-87d9-4f6060957d74/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.624592 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-vcnvx"
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.713787 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7j2q\" (UniqueName: \"kubernetes.io/projected/e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c-kube-api-access-n7j2q\") pod \"e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c\" (UID: \"e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c\") "
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.721841 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c-kube-api-access-n7j2q" (OuterVolumeSpecName: "kube-api-access-n7j2q") pod "e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c" (UID: "e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c"). InnerVolumeSpecName "kube-api-access-n7j2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.816641 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7j2q\" (UniqueName: \"kubernetes.io/projected/e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c-kube-api-access-n7j2q\") on node \"crc\" DevicePath \"\""
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.868850 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_92bb8486-3729-4b5d-8f09-b99baf382c52/proxy-httpd/0.log"
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.882314 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_92bb8486-3729-4b5d-8f09-b99baf382c52/ceilometer-central-agent/0.log"
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.892292 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_92bb8486-3729-4b5d-8f09-b99baf382c52/ceilometer-notification-agent/0.log"
Mar 10 15:12:04 crc kubenswrapper[4911]: I0310 15:12:04.933782 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_92bb8486-3729-4b5d-8f09-b99baf382c52/sg-core/0.log"
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.099116 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6e666af1-a2f4-4aa0-95c6-f8568be705d8/cinder-api-log/0.log"
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.193978 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6e666af1-a2f4-4aa0-95c6-f8568be705d8/cinder-api/0.log"
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.197713 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552592-vcnvx" event={"ID":"e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c","Type":"ContainerDied","Data":"d5b8275b65547cdf2291e5123312a6b45ab78dcd8917e76d4ff6d3535b6fccc6"}
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.197768 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5b8275b65547cdf2291e5123312a6b45ab78dcd8917e76d4ff6d3535b6fccc6"
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.197807 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-vcnvx"
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.364572 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b634ed72-d485-42b9-a382-24974c25ab42/cinder-scheduler/0.log"
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.438958 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b634ed72-d485-42b9-a382-24974c25ab42/probe/0.log"
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.550662 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-fstmx_7ea3ab89-1a92-47f9-85a5-3df48990343b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.697809 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ll9ct_db919e87-a5ed-4241-a4cf-7f5ddfe9e0a2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.734625 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552586-8fv8m"]
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.744685 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552586-8fv8m"]
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.817810 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q56d9_ec6c0ebc-e82c-4981-a32c-8ee98d9496ec/init/0.log"
Mar 10 15:12:05 crc kubenswrapper[4911]: I0310 15:12:05.992893 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q56d9_ec6c0ebc-e82c-4981-a32c-8ee98d9496ec/init/0.log"
Mar 10 15:12:06 crc kubenswrapper[4911]: I0310 15:12:06.011434 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q56d9_ec6c0ebc-e82c-4981-a32c-8ee98d9496ec/dnsmasq-dns/0.log"
Mar 10 15:12:06 crc kubenswrapper[4911]: I0310 15:12:06.205761 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11552a5-7768-424e-a30b-49e247d09194" path="/var/lib/kubelet/pods/d11552a5-7768-424e-a30b-49e247d09194/volumes"
Mar 10 15:12:06 crc kubenswrapper[4911]: I0310 15:12:06.578433 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8qkjb_37e4aed4-039e-4b2b-89d7-65c43eb8f688/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:12:06 crc kubenswrapper[4911]: I0310 15:12:06.647856 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_249c149f-3423-4163-8358-b36f6d55c6f3/glance-httpd/0.log"
Mar 10 15:12:06 crc kubenswrapper[4911]: I0310 15:12:06.754903 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_249c149f-3423-4163-8358-b36f6d55c6f3/glance-log/0.log"
Mar 10 15:12:06 crc kubenswrapper[4911]: I0310 15:12:06.890631 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_001c3353-3ca1-444c-a741-b2447e3ca566/glance-httpd/0.log"
Mar 10 15:12:06 crc kubenswrapper[4911]: I0310 15:12:06.905582 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_001c3353-3ca1-444c-a741-b2447e3ca566/glance-log/0.log"
Mar 10 15:12:07 crc kubenswrapper[4911]: I0310 15:12:07.243565 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54d884b5d4-lsz26_6be9e57d-52b9-4de2-9201-1b85feda712c/horizon/0.log"
Mar 10 15:12:07 crc kubenswrapper[4911]: I0310 15:12:07.290034 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zdcbr_5d1a5e0b-536c-4d5f-9c65-595361611fcd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:12:07 crc kubenswrapper[4911]: I0310 15:12:07.480462 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-p9w8h_6c29812e-9268-4508-aef7-cb43fe278c8d/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:12:07 crc kubenswrapper[4911]: I0310 15:12:07.506564 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54d884b5d4-lsz26_6be9e57d-52b9-4de2-9201-1b85feda712c/horizon-log/0.log"
Mar 10 15:12:07 crc kubenswrapper[4911]: I0310 15:12:07.763442 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29552581-nh5lq_7a04a21f-13ba-40c3-89ea-a4e87b1fec25/keystone-cron/0.log"
Mar 10 15:12:07 crc kubenswrapper[4911]: I0310 15:12:07.880528 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76d846bbc6-4wr5p_3370ba4c-d284-4d51-8b2d-d1da50950def/keystone-api/0.log"
Mar 10 15:12:07 crc kubenswrapper[4911]: I0310 15:12:07.908773 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_cd645aa4-53be-4ede-a00b-e294626fc333/kube-state-metrics/0.log"
Mar 10 15:12:08 crc kubenswrapper[4911]: I0310 15:12:08.059488 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mrq75_2d1eaf3f-414a-426a-8dbf-15825613d50a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:12:08 crc kubenswrapper[4911]: I0310 15:12:08.698243 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d48f5c7d5-2xxzq_5be3e6b2-8478-41bf-9fb1-09e053e8b5ac/neutron-httpd/0.log"
Mar 10 15:12:08 crc kubenswrapper[4911]: I0310 15:12:08.823253 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d48f5c7d5-2xxzq_5be3e6b2-8478-41bf-9fb1-09e053e8b5ac/neutron-api/0.log"
Mar 10 15:12:08 crc kubenswrapper[4911]: I0310 15:12:08.923360 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-584nf_9fffeac8-b15e-48c2-a04e-7f6b6b28e142/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:12:09 crc kubenswrapper[4911]: I0310 15:12:09.539783 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_17b26bd6-b922-485e-9655-001a52e6731c/nova-api-log/0.log"
Mar 10 15:12:09 crc kubenswrapper[4911]: I0310 15:12:09.627950 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_af0163a2-67bb-4bff-b4c7-c525f764e808/nova-cell0-conductor-conductor/0.log"
Mar 10 15:12:09 crc kubenswrapper[4911]: I0310 15:12:09.931152 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d81b0082-b7ae-4d38-8dd2-5d20459aa493/nova-cell1-conductor-conductor/0.log"
Mar 10 15:12:10 crc kubenswrapper[4911]: I0310 15:12:10.060038 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3b68c0d5-c7e3-4b1d-b9a0-337f56619c45/nova-cell1-novncproxy-novncproxy/0.log"
Mar 10 15:12:10 crc kubenswrapper[4911]: I0310 15:12:10.066213 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_17b26bd6-b922-485e-9655-001a52e6731c/nova-api-api/0.log"
Mar 10 15:12:10 crc kubenswrapper[4911]: I0310 15:12:10.261613 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-dksjc_43b7e07c-895e-46e1-9863-4dc4845a72ea/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:12:10 crc kubenswrapper[4911]: I0310 15:12:10.462931 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_607959a6-b845-45ea-b09a-966237b6dd1a/nova-metadata-log/0.log"
Mar 10 15:12:10 crc kubenswrapper[4911]: I0310 15:12:10.787969 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6e7efec5-8494-472d-b149-a6aeed4810b2/mysql-bootstrap/0.log"
Mar 10 15:12:10 crc kubenswrapper[4911]: I0310 15:12:10.863966 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f3e72e32-585d-4c71-9788-fd40c839f2ed/nova-scheduler-scheduler/0.log"
Mar 10 15:12:10 crc kubenswrapper[4911]: I0310 15:12:10.998158 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6e7efec5-8494-472d-b149-a6aeed4810b2/mysql-bootstrap/0.log"
Mar 10 15:12:11 crc kubenswrapper[4911]: I0310 15:12:11.007566 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6e7efec5-8494-472d-b149-a6aeed4810b2/galera/0.log"
Mar 10 15:12:11 crc kubenswrapper[4911]: I0310 15:12:11.270654 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5ff8ebc9-ea10-4e9c-be23-96608817ed84/mysql-bootstrap/0.log"
Mar 10 15:12:11 crc kubenswrapper[4911]: I0310 15:12:11.490630 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5ff8ebc9-ea10-4e9c-be23-96608817ed84/galera/0.log"
Mar 10 15:12:11 crc kubenswrapper[4911]: I0310 15:12:11.529956 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5ff8ebc9-ea10-4e9c-be23-96608817ed84/mysql-bootstrap/0.log"
Mar 10 15:12:11 crc kubenswrapper[4911]: I0310 15:12:11.733521 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_350c17be-173a-480f-bb79-314043291d4d/openstackclient/0.log"
Mar 10 15:12:11 crc kubenswrapper[4911]: I0310 15:12:11.784554 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9vssn_e43fdd12-0361-428c-8318-d1cec1c95399/ovn-controller/0.log"
Mar 10 15:12:11 crc kubenswrapper[4911]: I0310 15:12:11.889369 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_607959a6-b845-45ea-b09a-966237b6dd1a/nova-metadata-metadata/0.log"
Mar 10 15:12:12 crc kubenswrapper[4911]: I0310 15:12:12.127166 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6m8bl_1c9db65b-8d56-4b07-86cd-dc73f3aa87fe/openstack-network-exporter/0.log"
Mar 10 15:12:12 crc kubenswrapper[4911]: I0310 15:12:12.390665 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5hbsq_6941e0ca-8689-452e-82e4-d233cbbd45ec/ovsdb-server-init/0.log"
Mar 10 15:12:12 crc kubenswrapper[4911]: I0310 15:12:12.530401 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5hbsq_6941e0ca-8689-452e-82e4-d233cbbd45ec/ovs-vswitchd/0.log"
Mar 10 15:12:12 crc kubenswrapper[4911]: I0310 15:12:12.567181 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5hbsq_6941e0ca-8689-452e-82e4-d233cbbd45ec/ovsdb-server-init/0.log"
Mar 10 15:12:12 crc kubenswrapper[4911]: I0310 15:12:12.667257 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5hbsq_6941e0ca-8689-452e-82e4-d233cbbd45ec/ovsdb-server/0.log"
Mar 10 15:12:12 crc kubenswrapper[4911]: I0310 15:12:12.827928 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-prhkd_3d4d304b-5bae-475d-9d99-da422d354bb0/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 10 15:12:12 crc kubenswrapper[4911]: I0310 15:12:12.944252 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_43711b1d-4425-4081-ad98-ecee8b8c73c7/ovn-northd/0.log"
Mar 10 15:12:13 crc kubenswrapper[4911]: I0310 15:12:13.001358 4911
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_43711b1d-4425-4081-ad98-ecee8b8c73c7/openstack-network-exporter/0.log" Mar 10 15:12:13 crc kubenswrapper[4911]: I0310 15:12:13.149050 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3a8355c9-0644-458c-9df7-bbbfd01fc249/openstack-network-exporter/0.log" Mar 10 15:12:13 crc kubenswrapper[4911]: I0310 15:12:13.272603 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3a8355c9-0644-458c-9df7-bbbfd01fc249/ovsdbserver-nb/0.log" Mar 10 15:12:13 crc kubenswrapper[4911]: I0310 15:12:13.350512 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_48ab1a9d-fcce-4cdf-8e73-cae7562b4269/openstack-network-exporter/0.log" Mar 10 15:12:13 crc kubenswrapper[4911]: I0310 15:12:13.422295 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_48ab1a9d-fcce-4cdf-8e73-cae7562b4269/ovsdbserver-sb/0.log" Mar 10 15:12:13 crc kubenswrapper[4911]: I0310 15:12:13.628827 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-577d67f998-s8wh9_6f385703-b741-42ec-a63e-ec5a371859de/placement-api/0.log" Mar 10 15:12:13 crc kubenswrapper[4911]: I0310 15:12:13.745508 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-577d67f998-s8wh9_6f385703-b741-42ec-a63e-ec5a371859de/placement-log/0.log" Mar 10 15:12:13 crc kubenswrapper[4911]: I0310 15:12:13.819961 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0480ed86-7666-490a-9cd0-78a5ba05dac7/setup-container/0.log" Mar 10 15:12:14 crc kubenswrapper[4911]: I0310 15:12:14.110043 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0480ed86-7666-490a-9cd0-78a5ba05dac7/rabbitmq/0.log" Mar 10 15:12:14 crc kubenswrapper[4911]: I0310 15:12:14.125250 4911 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0480ed86-7666-490a-9cd0-78a5ba05dac7/setup-container/0.log" Mar 10 15:12:14 crc kubenswrapper[4911]: I0310 15:12:14.166819 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_85cb7ff2-e47f-46ad-a30d-6442c0fde95f/setup-container/0.log" Mar 10 15:12:14 crc kubenswrapper[4911]: I0310 15:12:14.365577 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_85cb7ff2-e47f-46ad-a30d-6442c0fde95f/setup-container/0.log" Mar 10 15:12:14 crc kubenswrapper[4911]: I0310 15:12:14.380857 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_85cb7ff2-e47f-46ad-a30d-6442c0fde95f/rabbitmq/0.log" Mar 10 15:12:14 crc kubenswrapper[4911]: I0310 15:12:14.439509 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-7pd8t_7b847208-7241-442f-8b60-b153986d1ea3/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 15:12:14 crc kubenswrapper[4911]: I0310 15:12:14.624235 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-rc5zm_05cc5850-302b-49b9-a8d3-62654314670a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 15:12:14 crc kubenswrapper[4911]: I0310 15:12:14.660013 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-q4dq8_3f912ff3-8e6b-4757-8708-865cb96e132e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 15:12:14 crc kubenswrapper[4911]: I0310 15:12:14.963891 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-wcn6n_8c81ff0d-aedd-419d-b159-b2e36b895839/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 15:12:15 crc kubenswrapper[4911]: I0310 15:12:15.017753 4911 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-qhk66_02e6e27d-b387-4fa4-993a-525b581993c1/ssh-known-hosts-edpm-deployment/0.log" Mar 10 15:12:15 crc kubenswrapper[4911]: I0310 15:12:15.372634 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-69f9f96d6c-plmfc_7f8852b3-f34b-4a37-b546-b7bd6b595203/proxy-httpd/0.log" Mar 10 15:12:15 crc kubenswrapper[4911]: I0310 15:12:15.386419 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-69f9f96d6c-plmfc_7f8852b3-f34b-4a37-b546-b7bd6b595203/proxy-server/0.log" Mar 10 15:12:15 crc kubenswrapper[4911]: I0310 15:12:15.637584 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hkqgc_4cdd5d89-ce0c-4bce-90bf-f4a7c4e8c46a/swift-ring-rebalance/0.log" Mar 10 15:12:15 crc kubenswrapper[4911]: I0310 15:12:15.747568 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/account-auditor/0.log" Mar 10 15:12:15 crc kubenswrapper[4911]: I0310 15:12:15.790599 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/account-reaper/0.log" Mar 10 15:12:15 crc kubenswrapper[4911]: I0310 15:12:15.913585 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/account-replicator/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.012295 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/account-server/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.037808 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/container-auditor/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.133953 4911 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/container-replicator/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.161524 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/container-server/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.252623 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/container-updater/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.305288 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/object-auditor/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.436207 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/object-expirer/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.464697 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/object-replicator/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.533240 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/object-server/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.552183 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/object-updater/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.650532 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/rsync/0.log" Mar 10 15:12:16 crc kubenswrapper[4911]: I0310 15:12:16.710507 4911 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_2ca33eab-721d-4858-8e23-9ffc6371926f/swift-recon-cron/0.log" Mar 10 15:12:17 crc kubenswrapper[4911]: I0310 15:12:17.572142 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c6a78318-420c-43fe-98f3-9306e18ee2d4/tempest-tests-tempest-tests-runner/0.log" Mar 10 15:12:17 crc kubenswrapper[4911]: I0310 15:12:17.675811 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-vqpm6_1fe4191c-9c8e-4d7c-9323-0fce2c397878/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 15:12:17 crc kubenswrapper[4911]: I0310 15:12:17.878979 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_736e35e8-db53-456b-a374-50f70159f967/test-operator-logs-container/0.log" Mar 10 15:12:17 crc kubenswrapper[4911]: I0310 15:12:17.948476 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-nttsn_94cde38e-e826-4cad-9f7a-55e42ec4964a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 15:12:27 crc kubenswrapper[4911]: I0310 15:12:27.348664 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_45599797-9a4e-428b-8f95-39b6db7bd84e/memcached/0.log" Mar 10 15:12:48 crc kubenswrapper[4911]: I0310 15:12:48.403925 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/util/0.log" Mar 10 15:12:48 crc kubenswrapper[4911]: I0310 15:12:48.614084 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/util/0.log" Mar 10 15:12:48 crc kubenswrapper[4911]: I0310 15:12:48.615051 4911 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/pull/0.log" Mar 10 15:12:48 crc kubenswrapper[4911]: I0310 15:12:48.664274 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/pull/0.log" Mar 10 15:12:49 crc kubenswrapper[4911]: I0310 15:12:49.088446 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/util/0.log" Mar 10 15:12:49 crc kubenswrapper[4911]: I0310 15:12:49.090042 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/pull/0.log" Mar 10 15:12:49 crc kubenswrapper[4911]: I0310 15:12:49.161410 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_93e2bbaf2f206dcc378ba2e94e0137bdc572c4ed69a9cf16169b8ec6danbvmv_491a73fc-185e-46a0-815e-b1ec70061fc5/extract/0.log" Mar 10 15:12:50 crc kubenswrapper[4911]: I0310 15:12:50.064391 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-t2lgw_c8490ff5-eaf3-4d9e-b9f9-7ad3ae159298/manager/0.log" Mar 10 15:12:50 crc kubenswrapper[4911]: I0310 15:12:50.442881 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-z5pz7_6a0bd4c9-4420-48be-9637-67ea2b5c89d1/manager/0.log" Mar 10 15:12:50 crc kubenswrapper[4911]: I0310 15:12:50.612222 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-8mqrm_14dd9547-ff92-4cb4-a055-e41fd390e90e/manager/0.log" Mar 10 15:12:50 crc 
kubenswrapper[4911]: I0310 15:12:50.862620 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-72kfl_e2b89cc3-8229-4401-b8e9-9a32bffb0f57/manager/0.log" Mar 10 15:12:51 crc kubenswrapper[4911]: I0310 15:12:51.431141 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-t2qf9_06a5238b-e7e1-49a5-9bb8-5f6162183a13/manager/0.log" Mar 10 15:12:51 crc kubenswrapper[4911]: I0310 15:12:51.605599 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-jjsfs_c5336054-5038-40f7-8512-9fe34269f6cd/manager/0.log" Mar 10 15:12:51 crc kubenswrapper[4911]: I0310 15:12:51.892053 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-hngxq_a937a94a-14cb-4319-9147-d0ac60c5cc6a/manager/0.log" Mar 10 15:12:52 crc kubenswrapper[4911]: I0310 15:12:52.032242 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-xw7bn_998d9bc8-11c1-4967-b3d9-1c823d6c41d6/manager/0.log" Mar 10 15:12:52 crc kubenswrapper[4911]: I0310 15:12:52.185086 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-8mq9k_90f49412-d2c3-46ba-9591-5adee9624834/manager/0.log" Mar 10 15:12:52 crc kubenswrapper[4911]: I0310 15:12:52.395107 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-7pcfv_53d2a376-b957-4875-8bfe-42d5dbc0a634/manager/0.log" Mar 10 15:12:52 crc kubenswrapper[4911]: I0310 15:12:52.612779 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-5v9fq_7c70b1a5-051f-43b5-80a3-1b462b9a50f8/manager/0.log" Mar 10 
15:12:52 crc kubenswrapper[4911]: I0310 15:12:52.852716 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-57mls_c5080d31-7711-4e4a-9902-4843929a16e9/manager/0.log" Mar 10 15:12:52 crc kubenswrapper[4911]: I0310 15:12:52.853592 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-cqdch_5cf94e3e-325d-4364-bf70-c479683b2be6/manager/0.log" Mar 10 15:12:53 crc kubenswrapper[4911]: I0310 15:12:53.156749 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7pbw6h_f82c1a17-4dc8-48c2-9bc2-9d7168524de3/manager/0.log" Mar 10 15:12:53 crc kubenswrapper[4911]: I0310 15:12:53.254106 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-554774d6c8-bpzx4_f34f11f5-13c7-426d-b30b-127ddd115a17/operator/0.log" Mar 10 15:12:53 crc kubenswrapper[4911]: I0310 15:12:53.412659 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5f7w8_2ccee19d-0e75-4358-87aa-16359f6bd2ee/registry-server/0.log" Mar 10 15:12:53 crc kubenswrapper[4911]: I0310 15:12:53.967504 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-2mnh2_a036efe0-e6cc-4ebe-8b06-70bc180b7b1c/manager/0.log" Mar 10 15:12:54 crc kubenswrapper[4911]: I0310 15:12:54.007073 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-khnvw_a07393e2-b210-4e68-8cd3-62a838f86071/manager/0.log" Mar 10 15:12:54 crc kubenswrapper[4911]: I0310 15:12:54.248153 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7n8pz_0b94f7c5-35a4-430f-bccb-011f386954d5/operator/0.log" 
Mar 10 15:12:54 crc kubenswrapper[4911]: I0310 15:12:54.413360 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-8jk8x_fca0a377-f77c-4e24-aec1-8ffb8ba87963/manager/0.log" Mar 10 15:12:54 crc kubenswrapper[4911]: I0310 15:12:54.573548 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-pgf4d_c8487a91-d6ca-480d-a451-35e6516bc9e8/manager/0.log" Mar 10 15:12:54 crc kubenswrapper[4911]: I0310 15:12:54.802018 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-wltgc_902c813a-1cba-4e57-9d1e-0e0a8ab0f6d6/manager/0.log" Mar 10 15:12:54 crc kubenswrapper[4911]: I0310 15:12:54.942536 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-fmrnr_ef5ab7b9-910d-4c79-9f03-ad4ce9fc6a20/manager/0.log" Mar 10 15:12:55 crc kubenswrapper[4911]: I0310 15:12:55.331399 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-774dfd9959-g5lwx_0fcc8b66-2a29-45c8-a445-a14770e3f157/manager/0.log" Mar 10 15:12:59 crc kubenswrapper[4911]: I0310 15:12:59.554929 4911 scope.go:117] "RemoveContainer" containerID="05c64525355e2e35fe082f05c60ccc834ba8560fb4e2479223ba37dae76e9c0d" Mar 10 15:12:59 crc kubenswrapper[4911]: I0310 15:12:59.817800 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-wv8p7_0dabe548-2d6c-4bbb-8199-6403e57d2ac9/manager/0.log" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.774206 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-shs9x"] Mar 10 15:13:00 crc kubenswrapper[4911]: E0310 15:13:00.775426 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c" containerName="oc" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.775510 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c" containerName="oc" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.775776 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c" containerName="oc" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.777846 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.826648 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shs9x"] Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.859583 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-utilities\") pod \"certified-operators-shs9x\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.859683 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-catalog-content\") pod \"certified-operators-shs9x\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.859710 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngct2\" (UniqueName: \"kubernetes.io/projected/da83421d-1ba8-456e-9833-4d5afc7bf567-kube-api-access-ngct2\") pod \"certified-operators-shs9x\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " 
pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.961687 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-utilities\") pod \"certified-operators-shs9x\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.961801 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-catalog-content\") pod \"certified-operators-shs9x\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.961828 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngct2\" (UniqueName: \"kubernetes.io/projected/da83421d-1ba8-456e-9833-4d5afc7bf567-kube-api-access-ngct2\") pod \"certified-operators-shs9x\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.962156 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-utilities\") pod \"certified-operators-shs9x\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:00 crc kubenswrapper[4911]: I0310 15:13:00.962441 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-catalog-content\") pod \"certified-operators-shs9x\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " 
pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:01 crc kubenswrapper[4911]: I0310 15:13:01.025162 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngct2\" (UniqueName: \"kubernetes.io/projected/da83421d-1ba8-456e-9833-4d5afc7bf567-kube-api-access-ngct2\") pod \"certified-operators-shs9x\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:01 crc kubenswrapper[4911]: I0310 15:13:01.149564 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:01 crc kubenswrapper[4911]: I0310 15:13:01.675955 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shs9x"] Mar 10 15:13:01 crc kubenswrapper[4911]: I0310 15:13:01.809027 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shs9x" event={"ID":"da83421d-1ba8-456e-9833-4d5afc7bf567","Type":"ContainerStarted","Data":"1470e1fcf91397cd02bb87879ba1f6180c30ed22a0a46085d9886f68b5161db3"} Mar 10 15:13:02 crc kubenswrapper[4911]: I0310 15:13:02.828494 4911 generic.go:334] "Generic (PLEG): container finished" podID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerID="dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401" exitCode=0 Mar 10 15:13:02 crc kubenswrapper[4911]: I0310 15:13:02.829074 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shs9x" event={"ID":"da83421d-1ba8-456e-9833-4d5afc7bf567","Type":"ContainerDied","Data":"dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401"} Mar 10 15:13:02 crc kubenswrapper[4911]: I0310 15:13:02.830963 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:13:04 crc kubenswrapper[4911]: I0310 15:13:04.858014 4911 generic.go:334] "Generic (PLEG): container 
finished" podID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerID="fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7" exitCode=0 Mar 10 15:13:04 crc kubenswrapper[4911]: I0310 15:13:04.858563 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shs9x" event={"ID":"da83421d-1ba8-456e-9833-4d5afc7bf567","Type":"ContainerDied","Data":"fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7"} Mar 10 15:13:05 crc kubenswrapper[4911]: I0310 15:13:05.872132 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shs9x" event={"ID":"da83421d-1ba8-456e-9833-4d5afc7bf567","Type":"ContainerStarted","Data":"566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5"} Mar 10 15:13:05 crc kubenswrapper[4911]: I0310 15:13:05.897397 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-shs9x" podStartSLOduration=3.329483027 podStartE2EDuration="5.897376963s" podCreationTimestamp="2026-03-10 15:13:00 +0000 UTC" firstStartedPulling="2026-03-10 15:13:02.830708839 +0000 UTC m=+4287.394228756" lastFinishedPulling="2026-03-10 15:13:05.398602785 +0000 UTC m=+4289.962122692" observedRunningTime="2026-03-10 15:13:05.892611225 +0000 UTC m=+4290.456131142" watchObservedRunningTime="2026-03-10 15:13:05.897376963 +0000 UTC m=+4290.460896880" Mar 10 15:13:11 crc kubenswrapper[4911]: I0310 15:13:11.150588 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:11 crc kubenswrapper[4911]: I0310 15:13:11.151366 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:11 crc kubenswrapper[4911]: I0310 15:13:11.557458 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-shs9x" Mar 
10 15:13:11 crc kubenswrapper[4911]: I0310 15:13:11.987772 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:12 crc kubenswrapper[4911]: I0310 15:13:12.041305 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shs9x"] Mar 10 15:13:13 crc kubenswrapper[4911]: I0310 15:13:13.949376 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-shs9x" podUID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerName="registry-server" containerID="cri-o://566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5" gracePeriod=2 Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.444657 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shs9x" Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.592289 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-utilities\") pod \"da83421d-1ba8-456e-9833-4d5afc7bf567\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.592561 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-catalog-content\") pod \"da83421d-1ba8-456e-9833-4d5afc7bf567\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.592687 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngct2\" (UniqueName: \"kubernetes.io/projected/da83421d-1ba8-456e-9833-4d5afc7bf567-kube-api-access-ngct2\") pod \"da83421d-1ba8-456e-9833-4d5afc7bf567\" (UID: \"da83421d-1ba8-456e-9833-4d5afc7bf567\") " 
Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.593083 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-utilities" (OuterVolumeSpecName: "utilities") pod "da83421d-1ba8-456e-9833-4d5afc7bf567" (UID: "da83421d-1ba8-456e-9833-4d5afc7bf567"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.598332 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da83421d-1ba8-456e-9833-4d5afc7bf567-kube-api-access-ngct2" (OuterVolumeSpecName: "kube-api-access-ngct2") pod "da83421d-1ba8-456e-9833-4d5afc7bf567" (UID: "da83421d-1ba8-456e-9833-4d5afc7bf567"). InnerVolumeSpecName "kube-api-access-ngct2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.652686 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da83421d-1ba8-456e-9833-4d5afc7bf567" (UID: "da83421d-1ba8-456e-9833-4d5afc7bf567"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.695746 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.695778 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngct2\" (UniqueName: \"kubernetes.io/projected/da83421d-1ba8-456e-9833-4d5afc7bf567-kube-api-access-ngct2\") on node \"crc\" DevicePath \"\""
Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.695791 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da83421d-1ba8-456e-9833-4d5afc7bf567-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.962161 4911 generic.go:334] "Generic (PLEG): container finished" podID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerID="566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5" exitCode=0
Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.962393 4911 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-shs9x"
Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.962426 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shs9x" event={"ID":"da83421d-1ba8-456e-9833-4d5afc7bf567","Type":"ContainerDied","Data":"566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5"}
Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.962769 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shs9x" event={"ID":"da83421d-1ba8-456e-9833-4d5afc7bf567","Type":"ContainerDied","Data":"1470e1fcf91397cd02bb87879ba1f6180c30ed22a0a46085d9886f68b5161db3"}
Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.962809 4911 scope.go:117] "RemoveContainer" containerID="566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5"
Mar 10 15:13:14 crc kubenswrapper[4911]: I0310 15:13:14.985467 4911 scope.go:117] "RemoveContainer" containerID="fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7"
Mar 10 15:13:15 crc kubenswrapper[4911]: I0310 15:13:15.006533 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shs9x"]
Mar 10 15:13:15 crc kubenswrapper[4911]: I0310 15:13:15.017203 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-shs9x"]
Mar 10 15:13:15 crc kubenswrapper[4911]: I0310 15:13:15.032206 4911 scope.go:117] "RemoveContainer" containerID="dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401"
Mar 10 15:13:15 crc kubenswrapper[4911]: I0310 15:13:15.058808 4911 scope.go:117] "RemoveContainer" containerID="566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5"
Mar 10 15:13:15 crc kubenswrapper[4911]: E0310 15:13:15.059553 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5\": container with ID starting with 566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5 not found: ID does not exist" containerID="566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5"
Mar 10 15:13:15 crc kubenswrapper[4911]: I0310 15:13:15.059588 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5"} err="failed to get container status \"566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5\": rpc error: code = NotFound desc = could not find container \"566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5\": container with ID starting with 566d11f7cf6822f88af8a7453e317d0793f934b4ea6c80b2830f85f4a2c7a3c5 not found: ID does not exist"
Mar 10 15:13:15 crc kubenswrapper[4911]: I0310 15:13:15.059618 4911 scope.go:117] "RemoveContainer" containerID="fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7"
Mar 10 15:13:15 crc kubenswrapper[4911]: E0310 15:13:15.060030 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7\": container with ID starting with fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7 not found: ID does not exist" containerID="fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7"
Mar 10 15:13:15 crc kubenswrapper[4911]: I0310 15:13:15.060136 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7"} err="failed to get container status \"fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7\": rpc error: code = NotFound desc = could not find container \"fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7\": container with ID
starting with fdf33b4bbaf1c920439372eb6e0bd86828bd6944aa50de716a860997322c6dc7 not found: ID does not exist"
Mar 10 15:13:15 crc kubenswrapper[4911]: I0310 15:13:15.060175 4911 scope.go:117] "RemoveContainer" containerID="dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401"
Mar 10 15:13:15 crc kubenswrapper[4911]: E0310 15:13:15.060981 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401\": container with ID starting with dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401 not found: ID does not exist" containerID="dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401"
Mar 10 15:13:15 crc kubenswrapper[4911]: I0310 15:13:15.061016 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401"} err="failed to get container status \"dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401\": rpc error: code = NotFound desc = could not find container \"dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401\": container with ID starting with dedb4cd49214874413a996f3b3447370088052da37daa7d03bda2efe858d0401 not found: ID does not exist"
Mar 10 15:13:16 crc kubenswrapper[4911]: I0310 15:13:16.208157 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da83421d-1ba8-456e-9833-4d5afc7bf567" path="/var/lib/kubelet/pods/da83421d-1ba8-456e-9833-4d5afc7bf567/volumes"
Mar 10 15:13:18 crc kubenswrapper[4911]: I0310 15:13:18.520842 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:13:18 crc kubenswrapper[4911]: I0310
15:13:18.521231 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:13:18 crc kubenswrapper[4911]: I0310 15:13:18.658053 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4s5dj_61dbbc3f-94f4-4c65-8c0b-7181159fcae3/control-plane-machine-set-operator/0.log"
Mar 10 15:13:18 crc kubenswrapper[4911]: I0310 15:13:18.860807 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s5jkn_39208142-b788-4b42-a0f2-421544f8833f/kube-rbac-proxy/0.log"
Mar 10 15:13:18 crc kubenswrapper[4911]: I0310 15:13:18.881168 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s5jkn_39208142-b788-4b42-a0f2-421544f8833f/machine-api-operator/0.log"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.395575 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g9hvj"]
Mar 10 15:13:23 crc kubenswrapper[4911]: E0310 15:13:23.397626 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerName="registry-server"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.397643 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerName="registry-server"
Mar 10 15:13:23 crc kubenswrapper[4911]: E0310 15:13:23.397673 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerName="extract-content"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.397679 4911 state_mem.go:107] "Deleted CPUSet
assignment" podUID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerName="extract-content"
Mar 10 15:13:23 crc kubenswrapper[4911]: E0310 15:13:23.397708 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerName="extract-utilities"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.397715 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerName="extract-utilities"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.397935 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="da83421d-1ba8-456e-9833-4d5afc7bf567" containerName="registry-server"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.399455 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.417512 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9hvj"]
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.490656 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-catalog-content\") pod \"redhat-marketplace-g9hvj\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") " pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.490990 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-utilities\") pod \"redhat-marketplace-g9hvj\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") " pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.491080 4911 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwdvt\" (UniqueName: \"kubernetes.io/projected/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-kube-api-access-bwdvt\") pod \"redhat-marketplace-g9hvj\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") " pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.593397 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwdvt\" (UniqueName: \"kubernetes.io/projected/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-kube-api-access-bwdvt\") pod \"redhat-marketplace-g9hvj\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") " pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.593555 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-catalog-content\") pod \"redhat-marketplace-g9hvj\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") " pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.593583 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-utilities\") pod \"redhat-marketplace-g9hvj\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") " pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.594123 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-utilities\") pod \"redhat-marketplace-g9hvj\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") " pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:23 crc kubenswrapper[4911]: I0310 15:13:23.594149 4911 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-catalog-content\") pod \"redhat-marketplace-g9hvj\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") " pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:24 crc kubenswrapper[4911]: I0310 15:13:24.023933 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwdvt\" (UniqueName: \"kubernetes.io/projected/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-kube-api-access-bwdvt\") pod \"redhat-marketplace-g9hvj\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") " pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:24 crc kubenswrapper[4911]: I0310 15:13:24.320531 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:24 crc kubenswrapper[4911]: I0310 15:13:24.816320 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9hvj"]
Mar 10 15:13:25 crc kubenswrapper[4911]: I0310 15:13:25.069588 4911 generic.go:334] "Generic (PLEG): container finished" podID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerID="19ff384691ccc985d270732df1e0779d4142dba93ee628518f32fea1679e528e" exitCode=0
Mar 10 15:13:25 crc kubenswrapper[4911]: I0310 15:13:25.069648 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9hvj" event={"ID":"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b","Type":"ContainerDied","Data":"19ff384691ccc985d270732df1e0779d4142dba93ee628518f32fea1679e528e"}
Mar 10 15:13:25 crc kubenswrapper[4911]: I0310 15:13:25.069682 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9hvj" event={"ID":"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b","Type":"ContainerStarted","Data":"765d064097cab85005c44e1a53ec5f6e13040e4f5c63c9efdf182c5029f08426"}
Mar 10 15:13:27 crc kubenswrapper[4911]: I0310
15:13:27.093241 4911 generic.go:334] "Generic (PLEG): container finished" podID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerID="cf24e748187738d76071d2cf6558c22d49bf0f94f03a1f3f8e58dd983674fc4e" exitCode=0
Mar 10 15:13:27 crc kubenswrapper[4911]: I0310 15:13:27.093322 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9hvj" event={"ID":"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b","Type":"ContainerDied","Data":"cf24e748187738d76071d2cf6558c22d49bf0f94f03a1f3f8e58dd983674fc4e"}
Mar 10 15:13:28 crc kubenswrapper[4911]: I0310 15:13:28.112626 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9hvj" event={"ID":"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b","Type":"ContainerStarted","Data":"f5aa569190cf976ab6fa352e3f4096cbcfc36ce4f674bf9eaf17f5ed2fe788c9"}
Mar 10 15:13:28 crc kubenswrapper[4911]: I0310 15:13:28.139567 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g9hvj" podStartSLOduration=2.718498841 podStartE2EDuration="5.139535375s" podCreationTimestamp="2026-03-10 15:13:23 +0000 UTC" firstStartedPulling="2026-03-10 15:13:25.072995254 +0000 UTC m=+4309.636515171" lastFinishedPulling="2026-03-10 15:13:27.494031768 +0000 UTC m=+4312.057551705" observedRunningTime="2026-03-10 15:13:28.134270753 +0000 UTC m=+4312.697790670" watchObservedRunningTime="2026-03-10 15:13:28.139535375 +0000 UTC m=+4312.703055332"
Mar 10 15:13:34 crc kubenswrapper[4911]: I0310 15:13:34.148641 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-t952c_b05a33fd-fd6b-4b1b-ad0f-427586c8e81a/cert-manager-controller/0.log"
Mar 10 15:13:34 crc kubenswrapper[4911]: I0310 15:13:34.320671 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:34 crc kubenswrapper[4911]: I0310 15:13:34.320834 4911 kubelet.go:2542]
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:34 crc kubenswrapper[4911]: I0310 15:13:34.387416 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:34 crc kubenswrapper[4911]: I0310 15:13:34.389281 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-5bw5t_8d1ebd76-111d-461e-8031-13d1071d1e64/cert-manager-cainjector/0.log"
Mar 10 15:13:34 crc kubenswrapper[4911]: I0310 15:13:34.404931 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-2fm2x_90feb50a-5cbc-4a77-b328-65b1f3adefc0/cert-manager-webhook/0.log"
Mar 10 15:13:35 crc kubenswrapper[4911]: I0310 15:13:35.232104 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:35 crc kubenswrapper[4911]: I0310 15:13:35.289216 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9hvj"]
Mar 10 15:13:37 crc kubenswrapper[4911]: I0310 15:13:37.202163 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g9hvj" podUID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerName="registry-server" containerID="cri-o://f5aa569190cf976ab6fa352e3f4096cbcfc36ce4f674bf9eaf17f5ed2fe788c9" gracePeriod=2
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.215555 4911 generic.go:334] "Generic (PLEG): container finished" podID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerID="f5aa569190cf976ab6fa352e3f4096cbcfc36ce4f674bf9eaf17f5ed2fe788c9" exitCode=0
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.215650 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9hvj"
event={"ID":"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b","Type":"ContainerDied","Data":"f5aa569190cf976ab6fa352e3f4096cbcfc36ce4f674bf9eaf17f5ed2fe788c9"}
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.430048 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.534683 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-utilities\") pod \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") "
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.534778 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwdvt\" (UniqueName: \"kubernetes.io/projected/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-kube-api-access-bwdvt\") pod \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") "
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.534818 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-catalog-content\") pod \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\" (UID: \"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b\") "
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.536088 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-utilities" (OuterVolumeSpecName: "utilities") pod "8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" (UID: "8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.542125 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-kube-api-access-bwdvt" (OuterVolumeSpecName: "kube-api-access-bwdvt") pod "8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" (UID: "8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b"). InnerVolumeSpecName "kube-api-access-bwdvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.561796 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" (UID: "8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.637950 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.637999 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:13:38 crc kubenswrapper[4911]: I0310 15:13:38.638015 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwdvt\" (UniqueName: \"kubernetes.io/projected/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b-kube-api-access-bwdvt\") on node \"crc\" DevicePath \"\""
Mar 10 15:13:39 crc kubenswrapper[4911]: I0310 15:13:39.230028 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9hvj"
event={"ID":"8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b","Type":"ContainerDied","Data":"765d064097cab85005c44e1a53ec5f6e13040e4f5c63c9efdf182c5029f08426"}
Mar 10 15:13:39 crc kubenswrapper[4911]: I0310 15:13:39.230625 4911 scope.go:117] "RemoveContainer" containerID="f5aa569190cf976ab6fa352e3f4096cbcfc36ce4f674bf9eaf17f5ed2fe788c9"
Mar 10 15:13:39 crc kubenswrapper[4911]: I0310 15:13:39.230129 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9hvj"
Mar 10 15:13:39 crc kubenswrapper[4911]: I0310 15:13:39.266523 4911 scope.go:117] "RemoveContainer" containerID="cf24e748187738d76071d2cf6558c22d49bf0f94f03a1f3f8e58dd983674fc4e"
Mar 10 15:13:39 crc kubenswrapper[4911]: I0310 15:13:39.295364 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9hvj"]
Mar 10 15:13:39 crc kubenswrapper[4911]: I0310 15:13:39.299633 4911 scope.go:117] "RemoveContainer" containerID="19ff384691ccc985d270732df1e0779d4142dba93ee628518f32fea1679e528e"
Mar 10 15:13:39 crc kubenswrapper[4911]: I0310 15:13:39.304314 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9hvj"]
Mar 10 15:13:40 crc kubenswrapper[4911]: I0310 15:13:40.205383 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" path="/var/lib/kubelet/pods/8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b/volumes"
Mar 10 15:13:48 crc kubenswrapper[4911]: I0310 15:13:48.173726 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-dw2wk_c9cf4f47-5446-49ed-95d1-b3e9322ce43e/nmstate-console-plugin/0.log"
Mar 10 15:13:48 crc kubenswrapper[4911]: I0310 15:13:48.365097 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-v89h8_85141c5d-b88f-4970-a62f-e826726facc1/nmstate-handler/0.log"
Mar 10 15:13:48 crc kubenswrapper[4911]:
I0310 15:13:48.520494 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:13:48 crc kubenswrapper[4911]: I0310 15:13:48.520564 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:13:48 crc kubenswrapper[4911]: I0310 15:13:48.942342 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-hj6l2_892d43d8-c12a-47b3-8056-d0d6024e961e/nmstate-metrics/0.log"
Mar 10 15:13:48 crc kubenswrapper[4911]: I0310 15:13:48.955925 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-hj6l2_892d43d8-c12a-47b3-8056-d0d6024e961e/kube-rbac-proxy/0.log"
Mar 10 15:13:49 crc kubenswrapper[4911]: I0310 15:13:49.136461 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-rr2pg_9d21e056-f31e-4a31-a5fe-1543fd7dbc98/nmstate-webhook/0.log"
Mar 10 15:13:49 crc kubenswrapper[4911]: I0310 15:13:49.142394 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-rdddd_fdc52c27-9268-47bd-b07e-8d9995db81bb/nmstate-operator/0.log"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.159767 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552594-2q58h"]
Mar 10 15:14:00 crc kubenswrapper[4911]: E0310 15:14:00.160959 4911 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerName="extract-utilities"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.160977 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerName="extract-utilities"
Mar 10 15:14:00 crc kubenswrapper[4911]: E0310 15:14:00.160996 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerName="extract-content"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.161003 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerName="extract-content"
Mar 10 15:14:00 crc kubenswrapper[4911]: E0310 15:14:00.161011 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerName="registry-server"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.161017 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerName="registry-server"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.161227 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9bc4a2-c62e-43f0-8ee3-4963084c8c4b" containerName="registry-server"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.162111 4911 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-2q58h"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.164461 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.165074 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.165348 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.188975 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-2q58h"]
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.226440 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sps8m\" (UniqueName: \"kubernetes.io/projected/11ccc6af-28b1-4ca9-a41e-0032ab6a5be4-kube-api-access-sps8m\") pod \"auto-csr-approver-29552594-2q58h\" (UID: \"11ccc6af-28b1-4ca9-a41e-0032ab6a5be4\") " pod="openshift-infra/auto-csr-approver-29552594-2q58h"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.328609 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sps8m\" (UniqueName: \"kubernetes.io/projected/11ccc6af-28b1-4ca9-a41e-0032ab6a5be4-kube-api-access-sps8m\") pod \"auto-csr-approver-29552594-2q58h\" (UID: \"11ccc6af-28b1-4ca9-a41e-0032ab6a5be4\") " pod="openshift-infra/auto-csr-approver-29552594-2q58h"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.354453 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sps8m\" (UniqueName: \"kubernetes.io/projected/11ccc6af-28b1-4ca9-a41e-0032ab6a5be4-kube-api-access-sps8m\") pod \"auto-csr-approver-29552594-2q58h\" (UID: \"11ccc6af-28b1-4ca9-a41e-0032ab6a5be4\") "
pod="openshift-infra/auto-csr-approver-29552594-2q58h"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.485539 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-2q58h"
Mar 10 15:14:00 crc kubenswrapper[4911]: I0310 15:14:00.926888 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-2q58h"]
Mar 10 15:14:01 crc kubenswrapper[4911]: I0310 15:14:01.436641 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-2q58h" event={"ID":"11ccc6af-28b1-4ca9-a41e-0032ab6a5be4","Type":"ContainerStarted","Data":"714f5ad49bdc818d2f357664d88406f7b977158b636f8eb1cc6b3f8eb8f5d1e8"}
Mar 10 15:14:02 crc kubenswrapper[4911]: I0310 15:14:02.452144 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-2q58h" event={"ID":"11ccc6af-28b1-4ca9-a41e-0032ab6a5be4","Type":"ContainerStarted","Data":"a3fc77b8614d470816f6e05b0bb12c6636f22bf326c509752c9a77861f9c408f"}
Mar 10 15:14:02 crc kubenswrapper[4911]: I0310 15:14:02.501341 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552594-2q58h" podStartSLOduration=1.5410882209999999 podStartE2EDuration="2.501310685s" podCreationTimestamp="2026-03-10 15:14:00 +0000 UTC" firstStartedPulling="2026-03-10 15:14:00.930675826 +0000 UTC m=+4345.494195733" lastFinishedPulling="2026-03-10 15:14:01.89089828 +0000 UTC m=+4346.454418197" observedRunningTime="2026-03-10 15:14:02.477581758 +0000 UTC m=+4347.041101675" watchObservedRunningTime="2026-03-10 15:14:02.501310685 +0000 UTC m=+4347.064830602"
Mar 10 15:14:03 crc kubenswrapper[4911]: I0310 15:14:03.463417 4911 generic.go:334] "Generic (PLEG): container finished" podID="11ccc6af-28b1-4ca9-a41e-0032ab6a5be4" containerID="a3fc77b8614d470816f6e05b0bb12c6636f22bf326c509752c9a77861f9c408f" exitCode=0
Mar 10 15:14:03 crc
kubenswrapper[4911]: I0310 15:14:03.463471 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-2q58h" event={"ID":"11ccc6af-28b1-4ca9-a41e-0032ab6a5be4","Type":"ContainerDied","Data":"a3fc77b8614d470816f6e05b0bb12c6636f22bf326c509752c9a77861f9c408f"} Mar 10 15:14:04 crc kubenswrapper[4911]: I0310 15:14:04.861326 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-2q58h" Mar 10 15:14:04 crc kubenswrapper[4911]: I0310 15:14:04.929240 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sps8m\" (UniqueName: \"kubernetes.io/projected/11ccc6af-28b1-4ca9-a41e-0032ab6a5be4-kube-api-access-sps8m\") pod \"11ccc6af-28b1-4ca9-a41e-0032ab6a5be4\" (UID: \"11ccc6af-28b1-4ca9-a41e-0032ab6a5be4\") " Mar 10 15:14:04 crc kubenswrapper[4911]: I0310 15:14:04.940807 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ccc6af-28b1-4ca9-a41e-0032ab6a5be4-kube-api-access-sps8m" (OuterVolumeSpecName: "kube-api-access-sps8m") pod "11ccc6af-28b1-4ca9-a41e-0032ab6a5be4" (UID: "11ccc6af-28b1-4ca9-a41e-0032ab6a5be4"). InnerVolumeSpecName "kube-api-access-sps8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:14:05 crc kubenswrapper[4911]: I0310 15:14:05.031820 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sps8m\" (UniqueName: \"kubernetes.io/projected/11ccc6af-28b1-4ca9-a41e-0032ab6a5be4-kube-api-access-sps8m\") on node \"crc\" DevicePath \"\"" Mar 10 15:14:05 crc kubenswrapper[4911]: I0310 15:14:05.486516 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-2q58h" event={"ID":"11ccc6af-28b1-4ca9-a41e-0032ab6a5be4","Type":"ContainerDied","Data":"714f5ad49bdc818d2f357664d88406f7b977158b636f8eb1cc6b3f8eb8f5d1e8"} Mar 10 15:14:05 crc kubenswrapper[4911]: I0310 15:14:05.486567 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="714f5ad49bdc818d2f357664d88406f7b977158b636f8eb1cc6b3f8eb8f5d1e8" Mar 10 15:14:05 crc kubenswrapper[4911]: I0310 15:14:05.486576 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-2q58h" Mar 10 15:14:05 crc kubenswrapper[4911]: I0310 15:14:05.555278 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-vfx7l"] Mar 10 15:14:05 crc kubenswrapper[4911]: I0310 15:14:05.565685 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-vfx7l"] Mar 10 15:14:06 crc kubenswrapper[4911]: I0310 15:14:06.206670 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c685a3-6149-4c66-aec6-48ce0436ac47" path="/var/lib/kubelet/pods/59c685a3-6149-4c66-aec6-48ce0436ac47/volumes" Mar 10 15:14:18 crc kubenswrapper[4911]: I0310 15:14:18.520602 4911 patch_prober.go:28] interesting pod/machine-config-daemon-tg8sx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 15:14:18 crc kubenswrapper[4911]: I0310 15:14:18.521400 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:14:18 crc kubenswrapper[4911]: I0310 15:14:18.521461 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" Mar 10 15:14:18 crc kubenswrapper[4911]: I0310 15:14:18.522388 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5"} pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:14:18 crc kubenswrapper[4911]: I0310 15:14:18.522462 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerName="machine-config-daemon" containerID="cri-o://8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" gracePeriod=600 Mar 10 15:14:18 crc kubenswrapper[4911]: E0310 15:14:18.651722 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:14:18 crc kubenswrapper[4911]: 
I0310 15:14:18.844768 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-78clj_1220afb2-8f3a-4b0c-8b88-9e690005eaf2/kube-rbac-proxy/0.log"
Mar 10 15:14:18 crc kubenswrapper[4911]: I0310 15:14:18.945958 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-78clj_1220afb2-8f3a-4b0c-8b88-9e690005eaf2/controller/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.072660 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-frr-files/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.251096 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-frr-files/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.310465 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-reloader/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.312265 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-metrics/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.349003 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-reloader/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.517717 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-reloader/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.521177 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-metrics/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.525176 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-frr-files/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.576947 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-metrics/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.626853 4911 generic.go:334] "Generic (PLEG): container finished" podID="e2970cff-e2bc-40e6-9d80-7388d88e840e" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" exitCode=0
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.626900 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerDied","Data":"8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5"}
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.626958 4911 scope.go:117] "RemoveContainer" containerID="07403a5bc42ca34f5ff0c49121e95c611f228b955cde52cc7ca9346c9d02ab86"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.627771 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5"
Mar 10 15:14:19 crc kubenswrapper[4911]: E0310 15:14:19.628036 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.813422 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-frr-files/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.850690 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-reloader/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.851764 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/cp-metrics/0.log"
Mar 10 15:14:19 crc kubenswrapper[4911]: I0310 15:14:19.871064 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/controller/0.log"
Mar 10 15:14:20 crc kubenswrapper[4911]: I0310 15:14:20.040814 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/frr-metrics/0.log"
Mar 10 15:14:20 crc kubenswrapper[4911]: I0310 15:14:20.045300 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/kube-rbac-proxy/0.log"
Mar 10 15:14:20 crc kubenswrapper[4911]: I0310 15:14:20.107465 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/kube-rbac-proxy-frr/0.log"
Mar 10 15:14:20 crc kubenswrapper[4911]: I0310 15:14:20.350317 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/reloader/0.log"
Mar 10 15:14:20 crc kubenswrapper[4911]: I0310 15:14:20.394187 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-npzpx_26684eb9-05cb-450a-b9b3-225b34518a92/frr-k8s-webhook-server/0.log"
Mar 10 15:14:21 crc kubenswrapper[4911]: I0310 15:14:21.146715 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6555f87c79-t9n77_bca856e8-824f-41ea-999f-353b10773511/webhook-server/0.log"
Mar 10 15:14:21 crc kubenswrapper[4911]: I0310 15:14:21.166417 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d5b96ddc6-xbpp4_2ab11607-0b3a-4d76-bb7d-34fd3c9fa271/manager/0.log"
Mar 10 15:14:21 crc kubenswrapper[4911]: I0310 15:14:21.457352 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w9cpd_884e47b6-07a7-4d77-b73a-ffa7a9a59807/kube-rbac-proxy/0.log"
Mar 10 15:14:21 crc kubenswrapper[4911]: I0310 15:14:21.591881 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6pqxq_ad777989-99eb-4ec3-91d2-890190261a26/frr/0.log"
Mar 10 15:14:21 crc kubenswrapper[4911]: I0310 15:14:21.869555 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w9cpd_884e47b6-07a7-4d77-b73a-ffa7a9a59807/speaker/0.log"
Mar 10 15:14:34 crc kubenswrapper[4911]: I0310 15:14:34.193330 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5"
Mar 10 15:14:34 crc kubenswrapper[4911]: E0310 15:14:34.194204 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 15:14:35 crc kubenswrapper[4911]: I0310 15:14:35.903608 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/util/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.152354 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/pull/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.165841 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/pull/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.181693 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/util/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.363348 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/pull/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.386845 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/util/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.389226 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82qrtqz_eb017921-d0db-4e33-b0aa-c02d2cf72ce3/extract/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.532759 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-utilities/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.692004 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-utilities/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310
15:14:36.730079 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-content/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.733086 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-content/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.909922 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-utilities/0.log"
Mar 10 15:14:36 crc kubenswrapper[4911]: I0310 15:14:36.947147 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/extract-content/0.log"
Mar 10 15:14:37 crc kubenswrapper[4911]: I0310 15:14:37.135368 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-utilities/0.log"
Mar 10 15:14:37 crc kubenswrapper[4911]: I0310 15:14:37.353941 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-utilities/0.log"
Mar 10 15:14:37 crc kubenswrapper[4911]: I0310 15:14:37.429754 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-content/0.log"
Mar 10 15:14:37 crc kubenswrapper[4911]: I0310 15:14:37.497643 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-content/0.log"
Mar 10 15:14:37 crc kubenswrapper[4911]: I0310 15:14:37.646111 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-utilities/0.log"
Mar 10 15:14:37 crc kubenswrapper[4911]: I0310 15:14:37.662671 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrkfz_94c0803f-7b0b-48e8-b19d-d81138d5fc10/registry-server/0.log"
Mar 10 15:14:37 crc kubenswrapper[4911]: I0310 15:14:37.677871 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/extract-content/0.log"
Mar 10 15:14:37 crc kubenswrapper[4911]: I0310 15:14:37.921084 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/util/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.144526 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/pull/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.164125 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/util/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.218326 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/pull/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.382098 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/util/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.426027 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/pull/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.439464 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dfhbk_1081e8d1-8f67-41ea-8fbb-a418473c68ca/registry-server/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.509542 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f48skp6_4f813eaf-714a-44ad-b1f7-b9b5d3c3ece3/extract/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.620799 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tvhkw_68abbbcf-c1ce-4be8-9252-9cd985160953/marketplace-operator/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.748650 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-utilities/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.919153 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-content/0.log"
Mar 10 15:14:38 crc kubenswrapper[4911]: I0310 15:14:38.925484 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-utilities/0.log"
Mar 10 15:14:39 crc kubenswrapper[4911]: I0310 15:14:39.011287 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-content/0.log"
Mar 10 15:14:39 crc kubenswrapper[4911]: I0310 15:14:39.142079 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-content/0.log"
Mar 10 15:14:39 crc kubenswrapper[4911]: I0310 15:14:39.164741 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/extract-utilities/0.log"
Mar 10 15:14:39 crc kubenswrapper[4911]: I0310 15:14:39.302095 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kgjb4_07da101e-bfe5-4c96-b1ed-9a9b4bc7e6ef/registry-server/0.log"
Mar 10 15:14:39 crc kubenswrapper[4911]: I0310 15:14:39.388920 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-utilities/0.log"
Mar 10 15:14:39 crc kubenswrapper[4911]: I0310 15:14:39.557428 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-content/0.log"
Mar 10 15:14:39 crc kubenswrapper[4911]: I0310 15:14:39.568177 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-utilities/0.log"
Mar 10 15:14:39 crc kubenswrapper[4911]: I0310 15:14:39.585891 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-content/0.log"
Mar 10 15:14:39 crc kubenswrapper[4911]: I0310 15:14:39.777247 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-utilities/0.log"
Mar 10 15:14:39 crc kubenswrapper[4911]: I0310 15:14:39.806706 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/extract-content/0.log"
Mar 10 15:14:40 crc kubenswrapper[4911]: I0310 15:14:40.302055 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kx5pg_db45043e-a5f4-4e42-a74b-a6477031d06d/registry-server/0.log"
Mar 10 15:14:47 crc kubenswrapper[4911]: I0310 15:14:47.192946 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5"
Mar 10 15:14:47 crc kubenswrapper[4911]: E0310 15:14:47.195406 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 15:14:58 crc kubenswrapper[4911]: I0310 15:14:58.195963 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5"
Mar 10 15:14:58 crc kubenswrapper[4911]: E0310 15:14:58.196792 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e"
Mar 10 15:14:59 crc kubenswrapper[4911]: I0310 15:14:59.689745 4911 scope.go:117] "RemoveContainer" containerID="d7fb5c1bb7a41a70787011854386956fad8cbf6052fe82695bc11fce6dd62455"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.169252 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"]
Mar 10 15:15:00 crc kubenswrapper[4911]: E0310
15:15:00.169850 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ccc6af-28b1-4ca9-a41e-0032ab6a5be4" containerName="oc"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.169876 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ccc6af-28b1-4ca9-a41e-0032ab6a5be4" containerName="oc"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.170116 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ccc6af-28b1-4ca9-a41e-0032ab6a5be4" containerName="oc"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.171213 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.174044 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.176081 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.212436 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"]
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.250566 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sjrl\" (UniqueName: \"kubernetes.io/projected/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-kube-api-access-6sjrl\") pod \"collect-profiles-29552595-sjmbk\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.252626 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-secret-volume\") pod \"collect-profiles-29552595-sjmbk\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.253596 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-config-volume\") pod \"collect-profiles-29552595-sjmbk\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.357827 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-config-volume\") pod \"collect-profiles-29552595-sjmbk\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.357930 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sjrl\" (UniqueName: \"kubernetes.io/projected/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-kube-api-access-6sjrl\") pod \"collect-profiles-29552595-sjmbk\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.357991 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-secret-volume\") pod \"collect-profiles-29552595-sjmbk\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.359435 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-config-volume\") pod \"collect-profiles-29552595-sjmbk\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.364095 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-secret-volume\") pod \"collect-profiles-29552595-sjmbk\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.375686 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sjrl\" (UniqueName: \"kubernetes.io/projected/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-kube-api-access-6sjrl\") pod \"collect-profiles-29552595-sjmbk\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:00 crc kubenswrapper[4911]: I0310 15:15:00.491662 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:01 crc kubenswrapper[4911]: I0310 15:15:01.007136 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"]
Mar 10 15:15:01 crc kubenswrapper[4911]: I0310 15:15:01.089390 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk" event={"ID":"d9e5d1e2-57ec-4c6d-8d54-9e025215e013","Type":"ContainerStarted","Data":"2ef7d8b1bb65e24bfb54bc3f4fdbe595e4a9e1d6b92d60b6efcccee44691d604"}
Mar 10 15:15:02 crc kubenswrapper[4911]: I0310 15:15:02.100044 4911 generic.go:334] "Generic (PLEG): container finished" podID="d9e5d1e2-57ec-4c6d-8d54-9e025215e013" containerID="56cba3f1a87eddf1675e0e63752b723f08bed13c147ad6974498d75128c43ecb" exitCode=0
Mar 10 15:15:02 crc kubenswrapper[4911]: I0310 15:15:02.100258 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk" event={"ID":"d9e5d1e2-57ec-4c6d-8d54-9e025215e013","Type":"ContainerDied","Data":"56cba3f1a87eddf1675e0e63752b723f08bed13c147ad6974498d75128c43ecb"}
Mar 10 15:15:03 crc kubenswrapper[4911]: E0310 15:15:03.271241 4911 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.153:45120->38.102.83.153:36859: write tcp 38.102.83.153:45120->38.102.83.153:36859: write: broken pipe
Mar 10 15:15:03 crc kubenswrapper[4911]: I0310 15:15:03.513776 4911 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:03 crc kubenswrapper[4911]: I0310 15:15:03.585021 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-secret-volume\") pod \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") "
Mar 10 15:15:03 crc kubenswrapper[4911]: I0310 15:15:03.585143 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sjrl\" (UniqueName: \"kubernetes.io/projected/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-kube-api-access-6sjrl\") pod \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") "
Mar 10 15:15:03 crc kubenswrapper[4911]: I0310 15:15:03.585311 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-config-volume\") pod \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\" (UID: \"d9e5d1e2-57ec-4c6d-8d54-9e025215e013\") "
Mar 10 15:15:03 crc kubenswrapper[4911]: I0310 15:15:03.591916 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-config-volume" (OuterVolumeSpecName: "config-volume") pod "d9e5d1e2-57ec-4c6d-8d54-9e025215e013" (UID: "d9e5d1e2-57ec-4c6d-8d54-9e025215e013"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:15:03 crc kubenswrapper[4911]: I0310 15:15:03.598023 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-kube-api-access-6sjrl" (OuterVolumeSpecName: "kube-api-access-6sjrl") pod "d9e5d1e2-57ec-4c6d-8d54-9e025215e013" (UID: "d9e5d1e2-57ec-4c6d-8d54-9e025215e013"). InnerVolumeSpecName "kube-api-access-6sjrl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:15:03 crc kubenswrapper[4911]: I0310 15:15:03.599343 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sjrl\" (UniqueName: \"kubernetes.io/projected/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-kube-api-access-6sjrl\") on node \"crc\" DevicePath \"\""
Mar 10 15:15:03 crc kubenswrapper[4911]: I0310 15:15:03.599388 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 15:15:03 crc kubenswrapper[4911]: I0310 15:15:03.601445 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d9e5d1e2-57ec-4c6d-8d54-9e025215e013" (UID: "d9e5d1e2-57ec-4c6d-8d54-9e025215e013"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:15:03 crc kubenswrapper[4911]: I0310 15:15:03.701631 4911 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9e5d1e2-57ec-4c6d-8d54-9e025215e013-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 15:15:04 crc kubenswrapper[4911]: I0310 15:15:04.147402 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk" event={"ID":"d9e5d1e2-57ec-4c6d-8d54-9e025215e013","Type":"ContainerDied","Data":"2ef7d8b1bb65e24bfb54bc3f4fdbe595e4a9e1d6b92d60b6efcccee44691d604"}
Mar 10 15:15:04 crc kubenswrapper[4911]: I0310 15:15:04.147628 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ef7d8b1bb65e24bfb54bc3f4fdbe595e4a9e1d6b92d60b6efcccee44691d604"
Mar 10 15:15:04 crc kubenswrapper[4911]: I0310 15:15:04.147824 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-sjmbk"
Mar 10 15:15:04 crc kubenswrapper[4911]: I0310 15:15:04.611371 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg"]
Mar 10 15:15:04 crc kubenswrapper[4911]: I0310 15:15:04.623573 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552550-r4dfg"]
Mar 10 15:15:06 crc kubenswrapper[4911]: I0310 15:15:06.210280 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9ea89f-fb27-4fac-b4f4-d4252dd9338a" path="/var/lib/kubelet/pods/6b9ea89f-fb27-4fac-b4f4-d4252dd9338a/volumes"
Mar 10 15:15:13 crc kubenswrapper[4911]: I0310 15:15:13.193963 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5"
Mar 10 15:15:13 crc kubenswrapper[4911]: E0310 15:15:13.194949 4911
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:15:26 crc kubenswrapper[4911]: I0310 15:15:26.202881 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:15:26 crc kubenswrapper[4911]: E0310 15:15:26.203927 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:15:39 crc kubenswrapper[4911]: I0310 15:15:39.195365 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:15:39 crc kubenswrapper[4911]: E0310 15:15:39.196403 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:15:54 crc kubenswrapper[4911]: I0310 15:15:54.193326 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:15:54 crc kubenswrapper[4911]: E0310 
15:15:54.194199 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:15:59 crc kubenswrapper[4911]: I0310 15:15:59.823489 4911 scope.go:117] "RemoveContainer" containerID="fff3e4c39434663385030278f7b7fc196b9c22c4e96e3b80e198dc5a9f863b71" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.182144 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552596-kl8vw"] Mar 10 15:16:00 crc kubenswrapper[4911]: E0310 15:16:00.183218 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e5d1e2-57ec-4c6d-8d54-9e025215e013" containerName="collect-profiles" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.183247 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e5d1e2-57ec-4c6d-8d54-9e025215e013" containerName="collect-profiles" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.183567 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e5d1e2-57ec-4c6d-8d54-9e025215e013" containerName="collect-profiles" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.184581 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-kl8vw" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.187908 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.188325 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.200007 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.214104 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552596-kl8vw"] Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.314178 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8r9c\" (UniqueName: \"kubernetes.io/projected/dd5d8c5a-515a-4813-819f-401041125d0b-kube-api-access-c8r9c\") pod \"auto-csr-approver-29552596-kl8vw\" (UID: \"dd5d8c5a-515a-4813-819f-401041125d0b\") " pod="openshift-infra/auto-csr-approver-29552596-kl8vw" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.416511 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8r9c\" (UniqueName: \"kubernetes.io/projected/dd5d8c5a-515a-4813-819f-401041125d0b-kube-api-access-c8r9c\") pod \"auto-csr-approver-29552596-kl8vw\" (UID: \"dd5d8c5a-515a-4813-819f-401041125d0b\") " pod="openshift-infra/auto-csr-approver-29552596-kl8vw" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.441063 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8r9c\" (UniqueName: \"kubernetes.io/projected/dd5d8c5a-515a-4813-819f-401041125d0b-kube-api-access-c8r9c\") pod \"auto-csr-approver-29552596-kl8vw\" (UID: \"dd5d8c5a-515a-4813-819f-401041125d0b\") " 
pod="openshift-infra/auto-csr-approver-29552596-kl8vw" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.514452 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-kl8vw" Mar 10 15:16:00 crc kubenswrapper[4911]: I0310 15:16:00.986467 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552596-kl8vw"] Mar 10 15:16:01 crc kubenswrapper[4911]: I0310 15:16:01.936995 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552596-kl8vw" event={"ID":"dd5d8c5a-515a-4813-819f-401041125d0b","Type":"ContainerStarted","Data":"8159c7722e1dba905fa0c1bf5b5c0ce6d24c7f5e5916f065546dc37f223ef4c7"} Mar 10 15:16:02 crc kubenswrapper[4911]: I0310 15:16:02.948562 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552596-kl8vw" event={"ID":"dd5d8c5a-515a-4813-819f-401041125d0b","Type":"ContainerStarted","Data":"5422324e5cd341a21b052becbf5ae52db169a7f9e103679714b4ff847ca41ce3"} Mar 10 15:16:02 crc kubenswrapper[4911]: I0310 15:16:02.973486 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552596-kl8vw" podStartSLOduration=1.508916563 podStartE2EDuration="2.973454463s" podCreationTimestamp="2026-03-10 15:16:00 +0000 UTC" firstStartedPulling="2026-03-10 15:16:00.998168714 +0000 UTC m=+4465.561688631" lastFinishedPulling="2026-03-10 15:16:02.462706614 +0000 UTC m=+4467.026226531" observedRunningTime="2026-03-10 15:16:02.967281017 +0000 UTC m=+4467.530800934" watchObservedRunningTime="2026-03-10 15:16:02.973454463 +0000 UTC m=+4467.536974390" Mar 10 15:16:03 crc kubenswrapper[4911]: I0310 15:16:03.962395 4911 generic.go:334] "Generic (PLEG): container finished" podID="dd5d8c5a-515a-4813-819f-401041125d0b" containerID="5422324e5cd341a21b052becbf5ae52db169a7f9e103679714b4ff847ca41ce3" exitCode=0 Mar 10 15:16:03 crc 
kubenswrapper[4911]: I0310 15:16:03.962482 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552596-kl8vw" event={"ID":"dd5d8c5a-515a-4813-819f-401041125d0b","Type":"ContainerDied","Data":"5422324e5cd341a21b052becbf5ae52db169a7f9e103679714b4ff847ca41ce3"} Mar 10 15:16:05 crc kubenswrapper[4911]: I0310 15:16:05.539010 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-kl8vw" Mar 10 15:16:05 crc kubenswrapper[4911]: I0310 15:16:05.564006 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8r9c\" (UniqueName: \"kubernetes.io/projected/dd5d8c5a-515a-4813-819f-401041125d0b-kube-api-access-c8r9c\") pod \"dd5d8c5a-515a-4813-819f-401041125d0b\" (UID: \"dd5d8c5a-515a-4813-819f-401041125d0b\") " Mar 10 15:16:05 crc kubenswrapper[4911]: I0310 15:16:05.624055 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5d8c5a-515a-4813-819f-401041125d0b-kube-api-access-c8r9c" (OuterVolumeSpecName: "kube-api-access-c8r9c") pod "dd5d8c5a-515a-4813-819f-401041125d0b" (UID: "dd5d8c5a-515a-4813-819f-401041125d0b"). InnerVolumeSpecName "kube-api-access-c8r9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:16:05 crc kubenswrapper[4911]: I0310 15:16:05.669484 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8r9c\" (UniqueName: \"kubernetes.io/projected/dd5d8c5a-515a-4813-819f-401041125d0b-kube-api-access-c8r9c\") on node \"crc\" DevicePath \"\"" Mar 10 15:16:05 crc kubenswrapper[4911]: I0310 15:16:05.990560 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552596-kl8vw" event={"ID":"dd5d8c5a-515a-4813-819f-401041125d0b","Type":"ContainerDied","Data":"8159c7722e1dba905fa0c1bf5b5c0ce6d24c7f5e5916f065546dc37f223ef4c7"} Mar 10 15:16:05 crc kubenswrapper[4911]: I0310 15:16:05.991009 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8159c7722e1dba905fa0c1bf5b5c0ce6d24c7f5e5916f065546dc37f223ef4c7" Mar 10 15:16:05 crc kubenswrapper[4911]: I0310 15:16:05.990817 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-kl8vw" Mar 10 15:16:06 crc kubenswrapper[4911]: I0310 15:16:06.072461 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-qcxxt"] Mar 10 15:16:06 crc kubenswrapper[4911]: I0310 15:16:06.090521 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-qcxxt"] Mar 10 15:16:06 crc kubenswrapper[4911]: I0310 15:16:06.219630 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85427124-2b71-420b-856e-863aa5c6c5be" path="/var/lib/kubelet/pods/85427124-2b71-420b-856e-863aa5c6c5be/volumes" Mar 10 15:16:08 crc kubenswrapper[4911]: I0310 15:16:08.193511 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:16:08 crc kubenswrapper[4911]: E0310 15:16:08.194083 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:16:20 crc kubenswrapper[4911]: I0310 15:16:20.205410 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:16:20 crc kubenswrapper[4911]: E0310 15:16:20.206229 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:16:33 crc kubenswrapper[4911]: I0310 15:16:33.194558 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:16:33 crc kubenswrapper[4911]: E0310 15:16:33.195632 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:16:41 crc kubenswrapper[4911]: I0310 15:16:41.420338 4911 generic.go:334] "Generic (PLEG): container finished" podID="0e94b793-17c7-4869-a3c5-089010867649" containerID="babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794" exitCode=0 Mar 10 15:16:41 crc kubenswrapper[4911]: 
I0310 15:16:41.420859 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f8lcn/must-gather-c7x5z" event={"ID":"0e94b793-17c7-4869-a3c5-089010867649","Type":"ContainerDied","Data":"babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794"} Mar 10 15:16:41 crc kubenswrapper[4911]: I0310 15:16:41.422928 4911 scope.go:117] "RemoveContainer" containerID="babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794" Mar 10 15:16:42 crc kubenswrapper[4911]: I0310 15:16:42.102059 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f8lcn_must-gather-c7x5z_0e94b793-17c7-4869-a3c5-089010867649/gather/0.log" Mar 10 15:16:46 crc kubenswrapper[4911]: I0310 15:16:46.199795 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:16:46 crc kubenswrapper[4911]: E0310 15:16:46.200599 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:16:52 crc kubenswrapper[4911]: I0310 15:16:52.880224 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f8lcn/must-gather-c7x5z"] Mar 10 15:16:52 crc kubenswrapper[4911]: I0310 15:16:52.881521 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-f8lcn/must-gather-c7x5z" podUID="0e94b793-17c7-4869-a3c5-089010867649" containerName="copy" containerID="cri-o://94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba" gracePeriod=2 Mar 10 15:16:52 crc kubenswrapper[4911]: I0310 15:16:52.893417 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-f8lcn/must-gather-c7x5z"] Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.317606 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f8lcn_must-gather-c7x5z_0e94b793-17c7-4869-a3c5-089010867649/copy/0.log" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.318803 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/must-gather-c7x5z" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.470933 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vdpn\" (UniqueName: \"kubernetes.io/projected/0e94b793-17c7-4869-a3c5-089010867649-kube-api-access-7vdpn\") pod \"0e94b793-17c7-4869-a3c5-089010867649\" (UID: \"0e94b793-17c7-4869-a3c5-089010867649\") " Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.471335 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e94b793-17c7-4869-a3c5-089010867649-must-gather-output\") pod \"0e94b793-17c7-4869-a3c5-089010867649\" (UID: \"0e94b793-17c7-4869-a3c5-089010867649\") " Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.481309 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e94b793-17c7-4869-a3c5-089010867649-kube-api-access-7vdpn" (OuterVolumeSpecName: "kube-api-access-7vdpn") pod "0e94b793-17c7-4869-a3c5-089010867649" (UID: "0e94b793-17c7-4869-a3c5-089010867649"). InnerVolumeSpecName "kube-api-access-7vdpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.551658 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f8lcn_must-gather-c7x5z_0e94b793-17c7-4869-a3c5-089010867649/copy/0.log" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.552362 4911 generic.go:334] "Generic (PLEG): container finished" podID="0e94b793-17c7-4869-a3c5-089010867649" containerID="94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba" exitCode=143 Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.552439 4911 scope.go:117] "RemoveContainer" containerID="94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.552479 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f8lcn/must-gather-c7x5z" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.575064 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vdpn\" (UniqueName: \"kubernetes.io/projected/0e94b793-17c7-4869-a3c5-089010867649-kube-api-access-7vdpn\") on node \"crc\" DevicePath \"\"" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.577026 4911 scope.go:117] "RemoveContainer" containerID="babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.669136 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e94b793-17c7-4869-a3c5-089010867649-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0e94b793-17c7-4869-a3c5-089010867649" (UID: "0e94b793-17c7-4869-a3c5-089010867649"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.677251 4911 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e94b793-17c7-4869-a3c5-089010867649-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.692803 4911 scope.go:117] "RemoveContainer" containerID="94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba" Mar 10 15:16:53 crc kubenswrapper[4911]: E0310 15:16:53.693671 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba\": container with ID starting with 94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba not found: ID does not exist" containerID="94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.693754 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba"} err="failed to get container status \"94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba\": rpc error: code = NotFound desc = could not find container \"94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba\": container with ID starting with 94f05e50f22ec68767437d9ff1b30f8b142e945a8f6c8af4a9708151381607ba not found: ID does not exist" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.693791 4911 scope.go:117] "RemoveContainer" containerID="babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794" Mar 10 15:16:53 crc kubenswrapper[4911]: E0310 15:16:53.694431 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794\": container with ID starting with babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794 not found: ID does not exist" containerID="babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794" Mar 10 15:16:53 crc kubenswrapper[4911]: I0310 15:16:53.694485 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794"} err="failed to get container status \"babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794\": rpc error: code = NotFound desc = could not find container \"babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794\": container with ID starting with babda84e106fe9b7e9c3c760b78e4505e4ca8cb23c60721845386c3152361794 not found: ID does not exist" Mar 10 15:16:54 crc kubenswrapper[4911]: I0310 15:16:54.206943 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e94b793-17c7-4869-a3c5-089010867649" path="/var/lib/kubelet/pods/0e94b793-17c7-4869-a3c5-089010867649/volumes" Mar 10 15:16:59 crc kubenswrapper[4911]: I0310 15:16:59.194438 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:16:59 crc kubenswrapper[4911]: E0310 15:16:59.195490 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:16:59 crc kubenswrapper[4911]: I0310 15:16:59.887864 4911 scope.go:117] "RemoveContainer" containerID="8312e4d97553517362fd42ffd33f0040bcaa6a5e00af0fa1e94a31587db9f12f" Mar 10 15:16:59 crc 
kubenswrapper[4911]: I0310 15:16:59.935951 4911 scope.go:117] "RemoveContainer" containerID="679f4da8d87d6df31ddcf8234a0eb46da424cd2641ff21b98d49c3b14f092cbd" Mar 10 15:17:12 crc kubenswrapper[4911]: I0310 15:17:12.196521 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:17:12 crc kubenswrapper[4911]: E0310 15:17:12.197244 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:17:24 crc kubenswrapper[4911]: I0310 15:17:24.194367 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:17:24 crc kubenswrapper[4911]: E0310 15:17:24.195149 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:17:38 crc kubenswrapper[4911]: I0310 15:17:38.194441 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:17:38 crc kubenswrapper[4911]: E0310 15:17:38.195421 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:17:52 crc kubenswrapper[4911]: I0310 15:17:52.194000 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:17:52 crc kubenswrapper[4911]: E0310 15:17:52.195709 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.016244 4911 scope.go:117] "RemoveContainer" containerID="1a47eec5a53819fea44944bfce4685f58f8a8fb0398387dac6dd90bb6660c866" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.151517 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552598-p2fsl"] Mar 10 15:18:00 crc kubenswrapper[4911]: E0310 15:18:00.152158 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e94b793-17c7-4869-a3c5-089010867649" containerName="gather" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.152180 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e94b793-17c7-4869-a3c5-089010867649" containerName="gather" Mar 10 15:18:00 crc kubenswrapper[4911]: E0310 15:18:00.152199 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e94b793-17c7-4869-a3c5-089010867649" containerName="copy" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.152207 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e94b793-17c7-4869-a3c5-089010867649" containerName="copy" 
Mar 10 15:18:00 crc kubenswrapper[4911]: E0310 15:18:00.152234 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5d8c5a-515a-4813-819f-401041125d0b" containerName="oc" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.152242 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5d8c5a-515a-4813-819f-401041125d0b" containerName="oc" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.152519 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e94b793-17c7-4869-a3c5-089010867649" containerName="copy" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.152541 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd5d8c5a-515a-4813-819f-401041125d0b" containerName="oc" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.152567 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e94b793-17c7-4869-a3c5-089010867649" containerName="gather" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.153631 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-p2fsl" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.156457 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.157117 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.157192 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.161618 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552598-p2fsl"] Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.272687 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wq2t\" (UniqueName: \"kubernetes.io/projected/b3ed2d62-b80d-40f1-99d9-0631cbb0f93c-kube-api-access-2wq2t\") pod \"auto-csr-approver-29552598-p2fsl\" (UID: \"b3ed2d62-b80d-40f1-99d9-0631cbb0f93c\") " pod="openshift-infra/auto-csr-approver-29552598-p2fsl" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.375265 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wq2t\" (UniqueName: \"kubernetes.io/projected/b3ed2d62-b80d-40f1-99d9-0631cbb0f93c-kube-api-access-2wq2t\") pod \"auto-csr-approver-29552598-p2fsl\" (UID: \"b3ed2d62-b80d-40f1-99d9-0631cbb0f93c\") " pod="openshift-infra/auto-csr-approver-29552598-p2fsl" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.394304 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wq2t\" (UniqueName: \"kubernetes.io/projected/b3ed2d62-b80d-40f1-99d9-0631cbb0f93c-kube-api-access-2wq2t\") pod \"auto-csr-approver-29552598-p2fsl\" (UID: \"b3ed2d62-b80d-40f1-99d9-0631cbb0f93c\") " 
pod="openshift-infra/auto-csr-approver-29552598-p2fsl" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.480793 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-p2fsl" Mar 10 15:18:00 crc kubenswrapper[4911]: I0310 15:18:00.939952 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552598-p2fsl"] Mar 10 15:18:01 crc kubenswrapper[4911]: I0310 15:18:01.290447 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-p2fsl" event={"ID":"b3ed2d62-b80d-40f1-99d9-0631cbb0f93c","Type":"ContainerStarted","Data":"c11814fc6f8c189203ada86014772dc37064cd6190bda40aea4b83276fd015e7"} Mar 10 15:18:02 crc kubenswrapper[4911]: I0310 15:18:02.302833 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-p2fsl" event={"ID":"b3ed2d62-b80d-40f1-99d9-0631cbb0f93c","Type":"ContainerStarted","Data":"7b2428a83dc5ca019ac1b3e52d295d199965ff2225b944928308606e415ca0ea"} Mar 10 15:18:02 crc kubenswrapper[4911]: I0310 15:18:02.318188 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552598-p2fsl" podStartSLOduration=1.457214803 podStartE2EDuration="2.318168442s" podCreationTimestamp="2026-03-10 15:18:00 +0000 UTC" firstStartedPulling="2026-03-10 15:18:00.948151168 +0000 UTC m=+4585.511671125" lastFinishedPulling="2026-03-10 15:18:01.809104847 +0000 UTC m=+4586.372624764" observedRunningTime="2026-03-10 15:18:02.313677861 +0000 UTC m=+4586.877197778" watchObservedRunningTime="2026-03-10 15:18:02.318168442 +0000 UTC m=+4586.881688359" Mar 10 15:18:03 crc kubenswrapper[4911]: I0310 15:18:03.319008 4911 generic.go:334] "Generic (PLEG): container finished" podID="b3ed2d62-b80d-40f1-99d9-0631cbb0f93c" containerID="7b2428a83dc5ca019ac1b3e52d295d199965ff2225b944928308606e415ca0ea" exitCode=0 Mar 10 15:18:03 crc 
kubenswrapper[4911]: I0310 15:18:03.319055 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-p2fsl" event={"ID":"b3ed2d62-b80d-40f1-99d9-0631cbb0f93c","Type":"ContainerDied","Data":"7b2428a83dc5ca019ac1b3e52d295d199965ff2225b944928308606e415ca0ea"} Mar 10 15:18:04 crc kubenswrapper[4911]: I0310 15:18:04.193689 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:18:04 crc kubenswrapper[4911]: E0310 15:18:04.194503 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:18:04 crc kubenswrapper[4911]: I0310 15:18:04.691170 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-p2fsl" Mar 10 15:18:04 crc kubenswrapper[4911]: I0310 15:18:04.773108 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wq2t\" (UniqueName: \"kubernetes.io/projected/b3ed2d62-b80d-40f1-99d9-0631cbb0f93c-kube-api-access-2wq2t\") pod \"b3ed2d62-b80d-40f1-99d9-0631cbb0f93c\" (UID: \"b3ed2d62-b80d-40f1-99d9-0631cbb0f93c\") " Mar 10 15:18:04 crc kubenswrapper[4911]: I0310 15:18:04.779039 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ed2d62-b80d-40f1-99d9-0631cbb0f93c-kube-api-access-2wq2t" (OuterVolumeSpecName: "kube-api-access-2wq2t") pod "b3ed2d62-b80d-40f1-99d9-0631cbb0f93c" (UID: "b3ed2d62-b80d-40f1-99d9-0631cbb0f93c"). InnerVolumeSpecName "kube-api-access-2wq2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:18:04 crc kubenswrapper[4911]: I0310 15:18:04.876506 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wq2t\" (UniqueName: \"kubernetes.io/projected/b3ed2d62-b80d-40f1-99d9-0631cbb0f93c-kube-api-access-2wq2t\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:05 crc kubenswrapper[4911]: I0310 15:18:05.338075 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-p2fsl" event={"ID":"b3ed2d62-b80d-40f1-99d9-0631cbb0f93c","Type":"ContainerDied","Data":"c11814fc6f8c189203ada86014772dc37064cd6190bda40aea4b83276fd015e7"} Mar 10 15:18:05 crc kubenswrapper[4911]: I0310 15:18:05.338121 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-p2fsl" Mar 10 15:18:05 crc kubenswrapper[4911]: I0310 15:18:05.338131 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c11814fc6f8c189203ada86014772dc37064cd6190bda40aea4b83276fd015e7" Mar 10 15:18:05 crc kubenswrapper[4911]: I0310 15:18:05.402519 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-vcnvx"] Mar 10 15:18:05 crc kubenswrapper[4911]: I0310 15:18:05.414191 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-vcnvx"] Mar 10 15:18:06 crc kubenswrapper[4911]: I0310 15:18:06.209159 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c" path="/var/lib/kubelet/pods/e48f2fc5-1f72-41f1-94e8-7ac8b8bbf90c/volumes" Mar 10 15:18:19 crc kubenswrapper[4911]: I0310 15:18:19.193533 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:18:19 crc kubenswrapper[4911]: E0310 15:18:19.194490 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:18:34 crc kubenswrapper[4911]: I0310 15:18:34.193957 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:18:34 crc kubenswrapper[4911]: E0310 15:18:34.194819 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:18:48 crc kubenswrapper[4911]: I0310 15:18:48.193877 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:18:48 crc kubenswrapper[4911]: E0310 15:18:48.195218 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:19:00 crc kubenswrapper[4911]: I0310 15:19:00.077797 4911 scope.go:117] "RemoveContainer" containerID="3c349e829765ed9c371f7c3b7400deb2abbe6a8d7ceee9148f2bd4df79aa3c38" Mar 10 15:19:01 crc kubenswrapper[4911]: I0310 15:19:01.193972 4911 scope.go:117] "RemoveContainer" 
containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:19:01 crc kubenswrapper[4911]: E0310 15:19:01.194694 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:19:15 crc kubenswrapper[4911]: I0310 15:19:15.193963 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:19:15 crc kubenswrapper[4911]: E0310 15:19:15.195238 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tg8sx_openshift-machine-config-operator(e2970cff-e2bc-40e6-9d80-7388d88e840e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" podUID="e2970cff-e2bc-40e6-9d80-7388d88e840e" Mar 10 15:19:27 crc kubenswrapper[4911]: I0310 15:19:27.194335 4911 scope.go:117] "RemoveContainer" containerID="8a3a9bb99d1b53008e2f79f125c5ad59ad990d477f3a8f9d084ce898f4d9e1d5" Mar 10 15:19:28 crc kubenswrapper[4911]: I0310 15:19:28.164460 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tg8sx" event={"ID":"e2970cff-e2bc-40e6-9d80-7388d88e840e","Type":"ContainerStarted","Data":"08477e007d1e1152c3ff6c5b39c380f4864553ace136a533b666c8d248e044ce"} Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.081506 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2kz62"] Mar 10 15:19:45 crc kubenswrapper[4911]: E0310 15:19:45.082557 4911 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ed2d62-b80d-40f1-99d9-0631cbb0f93c" containerName="oc" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.082571 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ed2d62-b80d-40f1-99d9-0631cbb0f93c" containerName="oc" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.082784 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ed2d62-b80d-40f1-99d9-0631cbb0f93c" containerName="oc" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.084264 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.093202 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhkkf\" (UniqueName: \"kubernetes.io/projected/40017348-4ef1-4b17-b308-2750d5214e6c-kube-api-access-nhkkf\") pod \"redhat-operators-2kz62\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.093295 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-catalog-content\") pod \"redhat-operators-2kz62\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.093325 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-utilities\") pod \"redhat-operators-2kz62\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.101608 4911 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2kz62"] Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.194955 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhkkf\" (UniqueName: \"kubernetes.io/projected/40017348-4ef1-4b17-b308-2750d5214e6c-kube-api-access-nhkkf\") pod \"redhat-operators-2kz62\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.195036 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-catalog-content\") pod \"redhat-operators-2kz62\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.195059 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-utilities\") pod \"redhat-operators-2kz62\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.195784 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-utilities\") pod \"redhat-operators-2kz62\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.195876 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-catalog-content\") pod \"redhat-operators-2kz62\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " 
pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.222238 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhkkf\" (UniqueName: \"kubernetes.io/projected/40017348-4ef1-4b17-b308-2750d5214e6c-kube-api-access-nhkkf\") pod \"redhat-operators-2kz62\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.406808 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:45 crc kubenswrapper[4911]: I0310 15:19:45.960029 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2kz62"] Mar 10 15:19:46 crc kubenswrapper[4911]: I0310 15:19:46.587529 4911 generic.go:334] "Generic (PLEG): container finished" podID="40017348-4ef1-4b17-b308-2750d5214e6c" containerID="54e466b4e8d5804df8597bdd6a7fabe088803a099eead4e1adb36b971b93af4a" exitCode=0 Mar 10 15:19:46 crc kubenswrapper[4911]: I0310 15:19:46.587846 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kz62" event={"ID":"40017348-4ef1-4b17-b308-2750d5214e6c","Type":"ContainerDied","Data":"54e466b4e8d5804df8597bdd6a7fabe088803a099eead4e1adb36b971b93af4a"} Mar 10 15:19:46 crc kubenswrapper[4911]: I0310 15:19:46.587880 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kz62" event={"ID":"40017348-4ef1-4b17-b308-2750d5214e6c","Type":"ContainerStarted","Data":"652bd27cc3b9646c1d100f544fc5310c6f25ea4690b10af7821f591656605128"} Mar 10 15:19:46 crc kubenswrapper[4911]: I0310 15:19:46.589860 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:19:48 crc kubenswrapper[4911]: I0310 15:19:48.610068 4911 generic.go:334] "Generic (PLEG): container finished" 
podID="40017348-4ef1-4b17-b308-2750d5214e6c" containerID="2608ce95bac40e104b0a4dbf216a7e257babf76bcd893809be0184dd2db70120" exitCode=0 Mar 10 15:19:48 crc kubenswrapper[4911]: I0310 15:19:48.610165 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kz62" event={"ID":"40017348-4ef1-4b17-b308-2750d5214e6c","Type":"ContainerDied","Data":"2608ce95bac40e104b0a4dbf216a7e257babf76bcd893809be0184dd2db70120"} Mar 10 15:19:49 crc kubenswrapper[4911]: I0310 15:19:49.623608 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kz62" event={"ID":"40017348-4ef1-4b17-b308-2750d5214e6c","Type":"ContainerStarted","Data":"ddc028f0094f3a59e43cb5ace59f83e1531edd39d0ea546a8c5a6b9516651776"} Mar 10 15:19:49 crc kubenswrapper[4911]: I0310 15:19:49.664569 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2kz62" podStartSLOduration=2.274317647 podStartE2EDuration="4.664541193s" podCreationTimestamp="2026-03-10 15:19:45 +0000 UTC" firstStartedPulling="2026-03-10 15:19:46.589601896 +0000 UTC m=+4691.153121813" lastFinishedPulling="2026-03-10 15:19:48.979825442 +0000 UTC m=+4693.543345359" observedRunningTime="2026-03-10 15:19:49.646634303 +0000 UTC m=+4694.210154220" watchObservedRunningTime="2026-03-10 15:19:49.664541193 +0000 UTC m=+4694.228061130" Mar 10 15:19:55 crc kubenswrapper[4911]: I0310 15:19:55.407835 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:55 crc kubenswrapper[4911]: I0310 15:19:55.408484 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:55 crc kubenswrapper[4911]: I0310 15:19:55.470505 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:56 crc 
kubenswrapper[4911]: I0310 15:19:56.271079 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:19:57 crc kubenswrapper[4911]: I0310 15:19:57.272562 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2kz62"] Mar 10 15:19:57 crc kubenswrapper[4911]: I0310 15:19:57.718212 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2kz62" podUID="40017348-4ef1-4b17-b308-2750d5214e6c" containerName="registry-server" containerID="cri-o://ddc028f0094f3a59e43cb5ace59f83e1531edd39d0ea546a8c5a6b9516651776" gracePeriod=2 Mar 10 15:19:59 crc kubenswrapper[4911]: I0310 15:19:59.741473 4911 generic.go:334] "Generic (PLEG): container finished" podID="40017348-4ef1-4b17-b308-2750d5214e6c" containerID="ddc028f0094f3a59e43cb5ace59f83e1531edd39d0ea546a8c5a6b9516651776" exitCode=0 Mar 10 15:19:59 crc kubenswrapper[4911]: I0310 15:19:59.741588 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kz62" event={"ID":"40017348-4ef1-4b17-b308-2750d5214e6c","Type":"ContainerDied","Data":"ddc028f0094f3a59e43cb5ace59f83e1531edd39d0ea546a8c5a6b9516651776"} Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.164508 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552600-7rfcc"] Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.166406 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-7rfcc" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.171223 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rdjn5" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.177184 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.177301 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.211421 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552600-7rfcc"] Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.239178 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnsst\" (UniqueName: \"kubernetes.io/projected/9e51330b-763e-4845-91ae-668106befedb-kube-api-access-bnsst\") pod \"auto-csr-approver-29552600-7rfcc\" (UID: \"9e51330b-763e-4845-91ae-668106befedb\") " pod="openshift-infra/auto-csr-approver-29552600-7rfcc" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.341628 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnsst\" (UniqueName: \"kubernetes.io/projected/9e51330b-763e-4845-91ae-668106befedb-kube-api-access-bnsst\") pod \"auto-csr-approver-29552600-7rfcc\" (UID: \"9e51330b-763e-4845-91ae-668106befedb\") " pod="openshift-infra/auto-csr-approver-29552600-7rfcc" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.367505 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnsst\" (UniqueName: \"kubernetes.io/projected/9e51330b-763e-4845-91ae-668106befedb-kube-api-access-bnsst\") pod \"auto-csr-approver-29552600-7rfcc\" (UID: \"9e51330b-763e-4845-91ae-668106befedb\") " 
pod="openshift-infra/auto-csr-approver-29552600-7rfcc" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.438514 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.444769 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-catalog-content\") pod \"40017348-4ef1-4b17-b308-2750d5214e6c\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.444978 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhkkf\" (UniqueName: \"kubernetes.io/projected/40017348-4ef1-4b17-b308-2750d5214e6c-kube-api-access-nhkkf\") pod \"40017348-4ef1-4b17-b308-2750d5214e6c\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.451042 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40017348-4ef1-4b17-b308-2750d5214e6c-kube-api-access-nhkkf" (OuterVolumeSpecName: "kube-api-access-nhkkf") pod "40017348-4ef1-4b17-b308-2750d5214e6c" (UID: "40017348-4ef1-4b17-b308-2750d5214e6c"). InnerVolumeSpecName "kube-api-access-nhkkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.494966 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-7rfcc" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.546871 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-utilities\") pod \"40017348-4ef1-4b17-b308-2750d5214e6c\" (UID: \"40017348-4ef1-4b17-b308-2750d5214e6c\") " Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.548175 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-utilities" (OuterVolumeSpecName: "utilities") pod "40017348-4ef1-4b17-b308-2750d5214e6c" (UID: "40017348-4ef1-4b17-b308-2750d5214e6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.548826 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhkkf\" (UniqueName: \"kubernetes.io/projected/40017348-4ef1-4b17-b308-2750d5214e6c-kube-api-access-nhkkf\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.639270 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40017348-4ef1-4b17-b308-2750d5214e6c" (UID: "40017348-4ef1-4b17-b308-2750d5214e6c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.652796 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.652830 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40017348-4ef1-4b17-b308-2750d5214e6c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.758269 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kz62" event={"ID":"40017348-4ef1-4b17-b308-2750d5214e6c","Type":"ContainerDied","Data":"652bd27cc3b9646c1d100f544fc5310c6f25ea4690b10af7821f591656605128"} Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.758334 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2kz62" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.758361 4911 scope.go:117] "RemoveContainer" containerID="ddc028f0094f3a59e43cb5ace59f83e1531edd39d0ea546a8c5a6b9516651776" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.779996 4911 scope.go:117] "RemoveContainer" containerID="2608ce95bac40e104b0a4dbf216a7e257babf76bcd893809be0184dd2db70120" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.798704 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2kz62"] Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.807908 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2kz62"] Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.811750 4911 scope.go:117] "RemoveContainer" containerID="54e466b4e8d5804df8597bdd6a7fabe088803a099eead4e1adb36b971b93af4a" Mar 10 15:20:00 crc kubenswrapper[4911]: I0310 15:20:00.980774 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552600-7rfcc"] Mar 10 15:20:01 crc kubenswrapper[4911]: I0310 15:20:01.772217 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552600-7rfcc" event={"ID":"9e51330b-763e-4845-91ae-668106befedb","Type":"ContainerStarted","Data":"99909190f6ab1f23fa197673b2141570ed4e894b391562e793649b6010943d72"} Mar 10 15:20:02 crc kubenswrapper[4911]: I0310 15:20:02.208645 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40017348-4ef1-4b17-b308-2750d5214e6c" path="/var/lib/kubelet/pods/40017348-4ef1-4b17-b308-2750d5214e6c/volumes" Mar 10 15:20:02 crc kubenswrapper[4911]: I0310 15:20:02.787975 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552600-7rfcc" 
event={"ID":"9e51330b-763e-4845-91ae-668106befedb","Type":"ContainerStarted","Data":"a8a5adf6e1a49a50e7c5708ac8e9207f43325034783272cc651208a5c175d9a8"} Mar 10 15:20:02 crc kubenswrapper[4911]: I0310 15:20:02.815264 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552600-7rfcc" podStartSLOduration=1.453136182 podStartE2EDuration="2.815235537s" podCreationTimestamp="2026-03-10 15:20:00 +0000 UTC" firstStartedPulling="2026-03-10 15:20:00.984561983 +0000 UTC m=+4705.548081900" lastFinishedPulling="2026-03-10 15:20:02.346661338 +0000 UTC m=+4706.910181255" observedRunningTime="2026-03-10 15:20:02.805327821 +0000 UTC m=+4707.368847748" watchObservedRunningTime="2026-03-10 15:20:02.815235537 +0000 UTC m=+4707.378755464" Mar 10 15:20:03 crc kubenswrapper[4911]: I0310 15:20:03.808218 4911 generic.go:334] "Generic (PLEG): container finished" podID="9e51330b-763e-4845-91ae-668106befedb" containerID="a8a5adf6e1a49a50e7c5708ac8e9207f43325034783272cc651208a5c175d9a8" exitCode=0 Mar 10 15:20:03 crc kubenswrapper[4911]: I0310 15:20:03.808292 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552600-7rfcc" event={"ID":"9e51330b-763e-4845-91ae-668106befedb","Type":"ContainerDied","Data":"a8a5adf6e1a49a50e7c5708ac8e9207f43325034783272cc651208a5c175d9a8"} Mar 10 15:20:05 crc kubenswrapper[4911]: I0310 15:20:05.186003 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-7rfcc" Mar 10 15:20:05 crc kubenswrapper[4911]: I0310 15:20:05.381366 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnsst\" (UniqueName: \"kubernetes.io/projected/9e51330b-763e-4845-91ae-668106befedb-kube-api-access-bnsst\") pod \"9e51330b-763e-4845-91ae-668106befedb\" (UID: \"9e51330b-763e-4845-91ae-668106befedb\") " Mar 10 15:20:05 crc kubenswrapper[4911]: I0310 15:20:05.394576 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e51330b-763e-4845-91ae-668106befedb-kube-api-access-bnsst" (OuterVolumeSpecName: "kube-api-access-bnsst") pod "9e51330b-763e-4845-91ae-668106befedb" (UID: "9e51330b-763e-4845-91ae-668106befedb"). InnerVolumeSpecName "kube-api-access-bnsst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:20:05 crc kubenswrapper[4911]: I0310 15:20:05.483417 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnsst\" (UniqueName: \"kubernetes.io/projected/9e51330b-763e-4845-91ae-668106befedb-kube-api-access-bnsst\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:05 crc kubenswrapper[4911]: I0310 15:20:05.834351 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552600-7rfcc" event={"ID":"9e51330b-763e-4845-91ae-668106befedb","Type":"ContainerDied","Data":"99909190f6ab1f23fa197673b2141570ed4e894b391562e793649b6010943d72"} Mar 10 15:20:05 crc kubenswrapper[4911]: I0310 15:20:05.834401 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99909190f6ab1f23fa197673b2141570ed4e894b391562e793649b6010943d72" Mar 10 15:20:05 crc kubenswrapper[4911]: I0310 15:20:05.834419 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-7rfcc" Mar 10 15:20:05 crc kubenswrapper[4911]: I0310 15:20:05.882108 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-2q58h"] Mar 10 15:20:05 crc kubenswrapper[4911]: I0310 15:20:05.891777 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-2q58h"] Mar 10 15:20:06 crc kubenswrapper[4911]: I0310 15:20:06.216378 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ccc6af-28b1-4ca9-a41e-0032ab6a5be4" path="/var/lib/kubelet/pods/11ccc6af-28b1-4ca9-a41e-0032ab6a5be4/volumes" Mar 10 15:21:00 crc kubenswrapper[4911]: I0310 15:21:00.186126 4911 scope.go:117] "RemoveContainer" containerID="a3fc77b8614d470816f6e05b0bb12c6636f22bf326c509752c9a77861f9c408f"